Sentiment analysis of financial news is critical for assessing market trends and informing data-driven decisions. This study evaluates the effectiveness of multilingual sentiment analysis of Russian and English financial news using BERT-based models (mBERT, Distil-mBERT, RuBERT), chosen for their NLP performance and manageable size. We compare two training strategies: (1) domain adaptation via masked language modeling (MLM) on financial corpora (GAZETA.ru, RBK, Daily Financial News), followed by fine-tuning for sentiment classification, and (2) direct fine-tuning on sentiment classification without prior domain adaptation. For the sentiment classification task, we used annotated datasets (FiNeS for Russian, Financial PhraseBank for English), selected for their domain relevance and annotation quality. To ensure language balance, all datasets were translated from Russian to English and vice versa. Results show that domain adaptation significantly improved the larger models, with mBERT and RuBERT achieving 82-83% accuracy, compared to 74-79% with direct fine-tuning. In contrast, Distil-mBERT performed identically (82% accuracy) under both approaches, suggesting that smaller models may not benefit from domain adaptation. These findings highlight the importance of model size when selecting a training strategy: domain adaptation via MLM proves critical for larger architectures but redundant for distilled models.
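
The following is a minimal sketch of the two training strategies compared above, using the Hugging Face Transformers library. The corpus contents, hyperparameters, output paths, and label scheme are illustrative placeholders, not the paper's actual configuration; the paper's corpora (GAZETA.ru, RBK, Daily Financial News) and labeled datasets (FiNeS, Financial PhraseBank) are stood in for by tiny in-memory examples.

```python
# Hedged sketch: (1) MLM domain adaptation then sentiment fine-tuning,
# vs. (2) direct fine-tuning. All data and hyperparameters are placeholders.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "bert-base-multilingual-cased"  # mBERT; swap in Distil-mBERT or RuBERT
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    # Fixed-length padding so the default collator can batch without a tokenizer.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

# --- Strategy 1, step 1: domain adaptation via masked language modeling ---
# Placeholder for the unlabeled financial-news corpora.
news = Dataset.from_dict({"text": ["Shares of the company rose 3% after earnings."]})
news = news.map(tokenize, batched=True, remove_columns=["text"])

mlm_model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
mlm_trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="mlm-adapted", num_train_epochs=1, report_to=[]),
    train_dataset=news,
    # Randomly masks 15% of tokens, the standard BERT MLM objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15),
)
mlm_trainer.train()
mlm_trainer.save_model("mlm-adapted")

# --- Strategy 1, step 2: fine-tune the adapted checkpoint for sentiment ---
# Placeholder for the annotated data; assumed 3-way labels (0/1/2 = neg/neu/pos).
labeled = Dataset.from_dict({"text": ["Profit guidance was cut sharply."], "label": [0]})
labeled = labeled.map(tokenize, batched=True, remove_columns=["text"])

clf_model = AutoModelForSequenceClassification.from_pretrained("mlm-adapted", num_labels=3)
Trainer(
    model=clf_model,
    args=TrainingArguments(output_dir="clf-adapted", num_train_epochs=1, report_to=[]),
    train_dataset=labeled,
).train()

# --- Strategy 2: direct fine-tuning, skipping the MLM step entirely ---
direct_model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)
Trainer(
    model=direct_model,
    args=TrainingArguments(output_dir="clf-direct", num_train_epochs=1, report_to=[]),
    train_dataset=labeled,
).train()
```

Loading the MLM checkpoint with AutoModelForSequenceClassification reuses the domain-adapted encoder weights and initializes only the classification head from scratch, which is the mechanism by which strategy (1) can outperform strategy (2) for the larger models.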