We develop loss functionals for training physics-informed neural networks using variational principles for nonpotential operators. In general, a quasiclassical variational functional is bounded from above or below, contains derivatives of lower order than those appearing in the partial differential equation, and incorporates some of the boundary conditions into the functional itself, which lowers the computational cost of evaluating the functional via Monte Carlo integration. The quasiclassical loss functional for a boundary value problem for a hyperbolic equation is obtained using the symmetrizing operator of V.M. Shalov. We demonstrate convergence of the neural network training and the advantages of the quasiclassical loss functional over the conventional residual loss functional on boundary value problems for a hyperbolic equation.
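For context, the following is a minimal sketch (not the functional derived in this work) of the conventional residual loss that serves as the baseline for comparison: the PDE residual of the 1D wave equation requires second derivatives of the network, and the initial and boundary conditions enter as separate penalty terms, all estimated by Monte Carlo sampling of collocation points. The network architecture, domain, and conditions below are hypothetical and chosen only for illustration.

```python
# Minimal sketch (assumptions only): conventional residual loss for a PINN
# on the 1D wave equation u_tt - u_xx = 0 on (t, x) in (0,1)^2,
# evaluated by Monte Carlo sampling of collocation points.
import torch

torch.manual_seed(0)

# Small fully connected network u_theta(t, x)
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def residual_loss(n_interior=1024, n_boundary=256):
    """Residual loss: the PDE term needs second derivatives of the network,
    and initial/boundary conditions appear as separate penalty terms."""
    # Monte Carlo sample of interior collocation points (t, x) in (0,1)^2
    tx = torch.rand(n_interior, 2, requires_grad=True)
    u = net(tx)
    grads = torch.autograd.grad(u, tx, torch.ones_like(u), create_graph=True)[0]
    u_t, u_x = grads[:, 0:1], grads[:, 1:2]
    u_tt = torch.autograd.grad(u_t, tx, torch.ones_like(u_t), create_graph=True)[0][:, 0:1]
    u_xx = torch.autograd.grad(u_x, tx, torch.ones_like(u_x), create_graph=True)[0][:, 1:2]
    pde = ((u_tt - u_xx) ** 2).mean()  # wave equation residual, second-order derivatives

    # Hypothetical initial condition u(0, x) = sin(pi x), sampled on t = 0
    t0 = torch.cat([torch.zeros(n_boundary, 1), torch.rand(n_boundary, 1)], dim=1)
    ic = (net(t0) - torch.sin(torch.pi * t0[:, 1:2])).pow(2).mean()

    # Hypothetical homogeneous Dirichlet conditions at x = 0 and x = 1
    xb = torch.cat([torch.rand(n_boundary, 1),
                    torch.randint(0, 2, (n_boundary, 1)).float()], dim=1)
    bc = net(xb).pow(2).mean()
    return pde + ic + bc

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = residual_loss()
    loss.backward()
    opt.step()
```

A quasiclassical loss functional would replace `residual_loss` with a Monte Carlo estimate of the variational functional built from the symmetrizing operator; since that functional involves only lower-order derivatives and absorbs part of the boundary conditions, each evaluation is cheaper than the residual loss above.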