This paper addresses the suppression of transient artifacts in signals, e.g., biomedical time series. To that end, we distinguish two types of artifact signals. We define 'Type 1' artifacts as spikes and sharp, brief waves that adhere to a baseline value of zero. We define 'Type 2' artifacts as comprising approximate step discontinuities. We model a Type 1 artifact as being sparse and having a sparse time-derivative, and a Type 2 artifact as having a sparse time-derivative. We model the observed time series as the sum of a low-pass signal (e.g., a background trend), an artifact signal of each type, and a white Gaussian stochastic process. To jointly estimate the components of this signal model, we formulate a sparse optimization problem and develop a rapidly converging, computationally efficient iterative algorithm, denoted TARA ('transient artifact reduction algorithm'). The effectiveness of the approach is illustrated using near-infrared spectroscopic time-series data.
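The additive signal model described above can be sketched numerically. The following Python snippet is a minimal illustration of the model's structure only (the waveforms, lengths, and parameter values are our own stand-ins, not taken from the paper, and this is not the TARA algorithm itself):

```python
import numpy as np

# Illustrative simulation of the additive model: y = f + x1 + x2 + w,
# where f is a low-pass trend, x1 a Type 1 artifact, x2 a Type 2
# artifact, and w white Gaussian noise. All values are hypothetical.
rng = np.random.default_rng(0)
N = 500
n = np.arange(N)

# Low-pass background trend f (a slow sinusoid as a stand-in).
f = 0.5 * np.sin(2 * np.pi * n / N)

# Type 1 artifact x1: spikes and brief waves adhering to a zero
# baseline, so x1 is sparse AND its first difference is sparse.
x1 = np.zeros(N)
x1[100:103] = [1.5, 2.0, 1.2]   # a sharp, brief wave
x1[350] = -1.8                  # an isolated spike

# Type 2 artifact x2: approximate step discontinuities, so only the
# first difference of x2 is sparse (x2 itself is not).
x2 = np.zeros(N)
x2[200:] += 1.0                 # a step discontinuity
x2[400:] -= 0.6                 # another step

# Additive white Gaussian noise w.
w = 0.05 * rng.standard_normal(N)

# Observed time series.
y = f + x1 + x2 + w

# Verify the sparsity structure the model assumes.
print(np.count_nonzero(x1))            # x1 sparse
print(np.count_nonzero(np.diff(x1)))   # diff(x1) sparse
print(np.count_nonzero(np.diff(x2)))   # diff(x2) sparse
```

Jointly estimating `f`, `x1`, and `x2` from `y` is what the paper's sparse optimization formulation and the TARA iterations accomplish; the snippet only makes the component-wise sparsity assumptions concrete.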