Subgradient — The subgradient generalizes the notion of a gradient/derivative and quantifies the rate of change of a non-differentiable (non-smooth) function ([1], section 8.1 in [2]). For a real-valued convex function $f$, a vector $g$ is a subgradient of $f$ at $x$ if $f(y) \ge f(x) + g^\top (y - x)$ for all $y$; the set of all subgradients at $x$ is called the subdifferential $\partial f(x)$. Wherever $f$ is differentiable, $\partial f(x) = \{\nabla f(x)\}$.
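As a minimal sketch (not from the post), the definition can be checked numerically for $f(x) = |x|$, which is convex but not differentiable at $0$: away from $0$ the subgradient is $\mathrm{sign}(x)$, and at $0$ any $g \in [-1, 1]$ satisfies the subgradient inequality. The choice $g = 0.3$ below is one arbitrary valid pick.

```python
import numpy as np

def f(x):
    # f(x) = |x|: convex, but not differentiable at x = 0
    return abs(x)

def subgradient(x):
    # Any g with f(y) >= f(x) + g * (y - x) for all y is a subgradient.
    # For |x|: g = sign(x) away from 0; at 0, any g in [-1, 1] works.
    return float(np.sign(x)) if x != 0 else 0.3  # 0.3 is one valid choice at 0

# Verify the subgradient inequality f(y) >= f(x) + g * (y - x)
# at a few base points x, over a grid of y values.
for x in [-2.0, 0.0, 1.5]:
    g = subgradient(x)
    for y in np.linspace(-5, 5, 101):
        assert f(y) >= f(x) + g * (y - x) - 1e-12
print("subgradient inequality holds at all tested points")
```

Plotting the affine lower bounds $f(x) + g(y - x)$ for several choices of $g \in [-1, 1]$ at $x = 0$ makes the geometric picture clear: each is a supporting line touching the graph of $|x|$ at the kink.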
α-strong convexity, β-strong smoothness — A differentiable function $f$ is $\alpha$-strongly convex if $f(y) \ge f(x) + \nabla f(x)^\top (y - x) + \frac{\alpha}{2}\|y - x\|^2$ for all $x, y$, and $\beta$-strongly smooth if $f(y) \le f(x) + \nabla f(x)^\top (y - x) + \frac{\beta}{2}\|y - x\|^2$ for all $x, y$; that is, $f$ is bounded below and above by quadratics around every point. Strong convexity often allows optimization algorithms to converge very quickly to an $\epsilon$-optimum (cf. FISTA and NESTA). This post will cover some fundamentals of these notions.
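A minimal sketch of these two bounds (an illustrative check, not code from the post): for the quadratic $f(x) = \frac{1}{2} x^\top A x$ with $A$ symmetric positive definite, the constants are exactly $\alpha = \lambda_{\min}(A)$ and $\beta = \lambda_{\max}(A)$. The matrix $A$ below is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(0)

# f(x) = 0.5 * x^T A x with A symmetric positive definite is
# alpha-strongly convex and beta-strongly smooth with
# alpha = lambda_min(A) and beta = lambda_max(A).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])  # assumed example matrix
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

alpha = np.linalg.eigvalsh(A).min()
beta = np.linalg.eigvalsh(A).max()

# Check the quadratic lower and upper bounds at random point pairs.
for _ in range(100):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    d = y - x
    lower = f(x) + grad(x) @ d + 0.5 * alpha * d @ d
    upper = f(x) + grad(x) @ d + 0.5 * beta * d @ d
    assert lower - 1e-9 <= f(y) <= upper + 1e-9
print("strong convexity and smoothness bounds hold")
```

For this quadratic the sandwich is exact in the sense that both bounds are tight along the corresponding eigendirections of $A$, which is why the condition number $\beta/\alpha$ governs the convergence rate of first-order methods.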