Accelerated methods for solving optimization problems have long been a topic of great interest. Based on the fixed-time (FxT) stability of nonlinear dynamical systems, we provide a unified approach for designing FxT gradient flows (FxTGFs). First, a general class of nonlinear functions for designing FxTGFs is introduced. A unified method for designing first-order FxTGFs is then presented under the Polyak-Łojasiewicz (PL) inequality assumption, a condition weaker than strong convexity. When both bounded and vanishing disturbances are present in the gradient flow, a specific class of nonsmooth robust FxTGFs with disturbance rejection is presented. Under the strict convexity assumption, a Newton-based FxTGF is given and further extended to time-varying optimization. The proposed FxTGFs are also applied to equation-constrained optimization. Moreover, an FxT proximal gradient flow admitting a wide range of parameters is provided for solving nonsmooth composite optimization. To demonstrate the effectiveness of the various FxTGFs, detailed static regret analyses for several typical FxTGFs are also provided. Finally, the proposed FxTGFs are applied to two network problems, namely the network consensus problem and the solution of a system of linear equations, from an optimization perspective. In particular, by choosing componentwise sign-preserving functions, these problems can be solved in a distributed manner, which extends existing results. The accelerated convergence and robustness of the proposed FxTGFs are validated through several numerical examples arising from practical applications.
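As a rough illustration of the kind of dynamics referred to above, the sketch below numerically integrates a commonly used first-order FxTGF form from the fixed-time stability literature, $\dot{x} = -c_1 \nabla f(x)/\|\nabla f(x)\|^{1-\alpha} - c_2 \nabla f(x)/\|\nabla f(x)\|^{1-\beta}$ with $0<\alpha<1<\beta$. The particular nonlinearity, parameter values, and forward-Euler discretization are illustrative assumptions for a simple quadratic (hence PL) objective, not necessarily the exact designs proposed in the paper.

```python
import numpy as np

def fxt_gradient_flow(grad, x0, c1=1.0, c2=1.0, alpha=0.5, beta=1.5,
                      dt=1e-3, steps=20000, tol=1e-8):
    """Forward-Euler integration of an illustrative fixed-time gradient flow:
    x_dot = -c1*g/||g||^(1-alpha) - c2*g/||g||^(1-beta), g = grad f(x),
    with 0 < alpha < 1 < beta (standard choice in the FxT literature)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        norm = np.linalg.norm(g)
        if norm < tol:  # gradient has (numerically) vanished
            break
        # Two scaled gradient terms: one dominant far from, one near, the optimum.
        x = x - dt * (c1 * g / norm**(1.0 - alpha) + c2 * g / norm**(1.0 - beta))
    return x

# Example: minimize f(x) = 0.5*x^T A x - b^T x (strongly convex, hence PL).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_star = fxt_gradient_flow(lambda x: A @ x - b, x0=np.array([5.0, -7.0]))
print("FxTGF solution:", x_star)
print("direct solve:  ", np.linalg.solve(A, b))  # should approximately agree
```

Note that with a fixed step size the discretization only reaches a small neighborhood of the minimizer (the continuous-time fixed-time guarantee does not carry over exactly); the example is meant purely to convey the structure of such flows.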