In this paper, we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled in the constraint.
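To fix ideas, a representative 2-block instance of such a model can be written as
\[
\min_{x \in \mathcal{X},\; y \in \mathcal{Y}} \; f(x, y) \quad \text{s.t.} \quad A x + B y = b,
\]
where $f$ is a convex function that is not necessarily separable in $(x, y)$, $\mathcal{X}$ and $\mathcal{Y}$ are closed convex sets, and $A$, $B$, $b$ are given problem data; this notation is introduced here only for illustration.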
For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers
(ADMM) is extended, assuming that it is easy to optimize the augmented Lagrangian
function over one block of variables at a time while fixing the other block. We prove that an O(1/t) iteration complexity bound holds under suitable conditions, where t is the number of iterations.
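As an illustrative sketch in the notation above, with the augmented Lagrangian
\[
L_\gamma(x, y, \lambda) \;=\; f(x, y) \;-\; \lambda^{\top}(A x + B y - b) \;+\; \frac{\gamma}{2}\,\|A x + B y - b\|^2
\]
and penalty parameter $\gamma > 0$, one round of such an alternating scheme can be sketched as
\begin{align*}
x^{k+1} &\in \operatorname*{arg\,min}_{x \in \mathcal{X}} \; L_\gamma(x, y^k, \lambda^k), \\
y^{k+1} &\in \operatorname*{arg\,min}_{y \in \mathcal{Y}} \; L_\gamma(x^{k+1}, y, \lambda^k), \\
\lambda^{k+1} &= \lambda^k - \gamma\,(A x^{k+1} + B y^{k+1} - b),
\end{align*}
where the dual step size is taken equal to $\gamma$ purely for concreteness.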
If the subroutines of the ADMM cannot be implemented, we propose alternative algorithms, which we call the alternating proximal gradient method of multipliers, the alternating gradient projection method of multipliers, and hybrids thereof. Under suitable conditions, the O(1/t) iteration complexity bound is shown to hold for all the newly proposed algorithms.
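As a rough sketch of the gradient-projection flavor, again in the illustrative notation above and assuming $f$ is differentiable, each exact minimization may be replaced by a single projected gradient step on the augmented Lagrangian,
\begin{align*}
x^{k+1} &= \Pi_{\mathcal{X}}\!\bigl( x^k - \alpha\, \nabla_x L_\gamma(x^k, y^k, \lambda^k) \bigr), \\
y^{k+1} &= \Pi_{\mathcal{Y}}\!\bigl( y^k - \beta\, \nabla_y L_\gamma(x^{k+1}, y^k, \lambda^k) \bigr), \\
\lambda^{k+1} &= \lambda^k - \gamma\,(A x^{k+1} + B y^{k+1} - b),
\end{align*}
where $\Pi_{\mathcal{X}}$ and $\Pi_{\mathcal{Y}}$ denote Euclidean projections onto $\mathcal{X}$ and $\mathcal{Y}$ and $\alpha, \beta > 0$ are illustrative step sizes; roughly speaking, the proximal gradient variants are of the same spirit, with the projections replaced by proximal steps.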
Finally, we extend the analysis of the ADMM to the general multi-block case.