* Specialized attention module for Wuerstchen.
* Reshaping ops.
* Attention processor.
* Finish the forward pass.
* Hook the new attention processor.
* Get the prior forward pass to work.
* Make it contiguous.
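
Taken together, these items describe a small, pluggable attention path: a specialized module that flattens the (B, C, H, W) feature map into tokens, runs cross-attention against the conditioning through a swappable processor, and reshapes back (calling `.contiguous()` before `view()`, since `transpose()` leaves tensors non-contiguous). Below is a minimal PyTorch sketch of how those pieces might fit. The class names (`AttnBlock`, `WuerstchenAttnProcessor`), the exact reshaping, the `set_processor` hook, and the use of `F.scaled_dot_product_attention` (PyTorch >= 2.0) are illustrative assumptions based on the checklist, not the repository's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WuerstchenAttnProcessor:
    """Hypothetical processor: flattens the spatial map, runs
    cross-attention against the conditioning, and reshapes back."""

    def __call__(self, attn, hidden_states, encoder_hidden_states):
        b, c, h, w = hidden_states.shape
        # Reshaping ops: (B, C, H, W) -> (B, H*W, C), so pixels become tokens.
        x = hidden_states.view(b, c, h * w).transpose(1, 2)
        x = attn.norm(x)

        q = attn.to_q(x)
        k = attn.to_k(encoder_hidden_states)
        v = attn.to_v(encoder_hidden_states)

        # Split heads: (B, T, C) -> (B, heads, T, C // heads).
        def split_heads(t):
            return t.view(b, -1, attn.heads, c // attn.heads).transpose(1, 2)

        out = F.scaled_dot_product_attention(split_heads(q), split_heads(k), split_heads(v))

        # Merge heads; transpose() returns a non-contiguous tensor,
        # so make it contiguous before view().
        out = out.transpose(1, 2).contiguous().view(b, h * w, c)
        out = attn.to_out(out)

        # Back to the spatial layout: (B, H*W, C) -> (B, C, H, W).
        return out.transpose(1, 2).contiguous().view(b, c, h, w)


class AttnBlock(nn.Module):
    """Hypothetical specialized attention module with a pluggable processor."""

    def __init__(self, dim, cond_dim, heads):
        super().__init__()
        self.heads = heads  # dim must be divisible by heads
        self.norm = nn.LayerNorm(dim)
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(cond_dim, dim)
        self.to_v = nn.Linear(cond_dim, dim)
        self.to_out = nn.Linear(dim, dim)
        self.processor = WuerstchenAttnProcessor()

    def set_processor(self, processor):
        # Hook a new attention processor at runtime.
        self.processor = processor

    def forward(self, x, cond):
        # Residual connection around the processed attention output.
        return x + self.processor(self, x, cond)


# Usage sketch: shapes are made up for illustration.
block = AttnBlock(dim=64, cond_dim=128, heads=4)
block.set_processor(WuerstchenAttnProcessor())
x = torch.randn(2, 64, 8, 8)     # latent feature map
cond = torch.randn(2, 77, 128)   # e.g. text-encoder hidden states
assert block(x, cond).shape == x.shape
```

The processor-object pattern keeps the reshaping and attention math out of the module itself, so a different processor (e.g. an optimized kernel) can be hooked in without touching the block's weights or forward signature.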