This study is devoted to the analysis and assessment of data loss in network packets during the modeling of Internet of Things (IoT) systems. The author offers a model of an automobile radar that operates in an IoT environment using the IPv4 protocol and the LTE standard. The project is motivated by the high rate of deaths, injuries, and property damage from collisions with moving objects, including other vehicles, animals, and pedestrians. The radar's warning signal creates the time buffer the driver needs to react. Because the concept's implementation requires fast, high-quality data exchange, packet loss is a critical issue. The author treats the process as a function of multiple variables, each of which is itself a function: the dependence of the loss factor on packet size, on node speed, and on signal strength. These dependencies were established, empirical data were gathered, and the results are presented graphically and analytically. The binomial interpolation method was applied to obtain the final result, allowing functions to be formed from the empirically collected data. The advantages of binomial interpolation, which motivated its selection as the mathematical data-processing technique, include its applicability to a wide range of function types and its relative simplicity; a drawback, however, is its limited accuracy when approximating complex functions. The article's findings and recommendations may be helpful to researchers and developers designing and modeling communication systems for IoT devices, as well as to those studying packet loss and transmission efficiency in network-technology-based IoT deployments. © 2025 Elsevier B.V. All rights reserved.