In this paper we present an overview of the s-step Conjugate Gradient (CG) method and develop a novel formulation of the s-step BiConjugate Gradient Stabilized (BiCGStab) iterative method. We also show how to add preconditioning to both of these s-step schemes. We explain their relationship to their standard, block, and communication-avoiding counterparts. Finally, we explore their advantages, such as the availability of the matrix powers kernel and the use of block dot products, as well as their drawbacks, such as the extra computational overhead and the numerical stability issues related to the use of a monomial basis for the Krylov subspace. We note that the mathematical techniques used in this paper can be applied to other methods in sparse linear algebra and related fields, such as optimization.
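To make the two kernels mentioned above concrete, the sketch below builds the monomial Krylov basis [r, Ar, ..., A^s r] (the computation that a matrix powers kernel performs in one communication round) and forms a Gram matrix as a block dot product. This is a minimal dense illustration, not the paper's algorithm; the names `monomial_basis` and `block_dot` are ours.

```python
import numpy as np

def monomial_basis(A, r, s):
    """Build the monomial Krylov basis [r, A r, ..., A^s r].

    In a distributed setting a matrix powers kernel produces all s+1
    vectors with a single round of communication instead of one round
    per sparse matrix-vector product; here a plain loop stands in.
    """
    V = np.empty((len(r), s + 1))
    V[:, 0] = r
    for j in range(s):
        V[:, j + 1] = A @ V[:, j]
    return V

def block_dot(V, W):
    """Block dot product: one Gram matrix V^T W replaces O(s^2)
    individual dot products, i.e., one global reduction."""
    return V.T @ W

# Toy usage on a small SPD matrix (as CG requires).
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = A @ A.T + 6 * np.eye(6)
r = rng.standard_normal(6)
V = monomial_basis(A, r, s=3)   # 6 x 4 basis matrix
G = block_dot(V, V)             # 4 x 4 Gram matrix
```

The ill-conditioning alluded to in the abstract shows up here directly: as s grows, the columns of `V` align with the dominant eigenvector, so `G` becomes nearly singular, which is why monomial-basis s-step methods limit s or switch to better-conditioned bases.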