Carsten Frederiksen / Miha Krajnc

Wednesday, November 6, 2024

by DBF Edvard Rusjan Team, University of Ljubljana

Simulating Aerodynamic Performance and Propulsion Systems for Model Aircraft

Simulating aerodynamic performance and propulsion systems for model aircraft is essential to the design process. For the Edvard Rusjan team, which competes in the prestigious Design, Build, Fly (DBF) competition, simulations play a pivotal role in developing their aircraft. Validation through real-world data is crucial to ensure the accuracy and reliability of these simulations. The team relies on measurement systems to gather essential data on thrust, lift, and power consumption to optimize the aircraft design. Dewesoft provided the team with the tools to enhance their simulations and improve overall performance.

Named after a Slovenian aviation pioneer, the Edvard Rusjan Team is a group of mechanical engineering students from the University of Ljubljana. Every year, the team travels to the USA for the Design-Build-Fly competition, where students from all over the world design, build, and fly radio-controlled aircraft. The team used Dewesoft’s advanced equipment to refine its simulations, collect precise data, and ultimately improve the aircraft’s design for the DBF competition.

The DBF competition

The Design, Build, Fly (DBF) competition is an annual event that draws participation from over 100 university teams worldwide. The AIAA Applied Aerodynamics, Aircraft Design, Design Engineering, and Flight Test Technical Committees started DBF in 1996 to give university students real-world aircraft design experience and an opportunity to validate their analytic studies.

The competition challenges teams to design, construct, and fly a remote-controlled aircraft based on specific rules that change each year. DBF is divided into two main parts: the report, which documents the design process and testing throughout the year, and the fly-off, where the aircraft’s performance is demonstrated in flight.

The team must submit a technical report two months before the competition. This report covers team organization, the aircraft’s conceptual and detailed construction, the simulation and testing process, and technical drawings.

At the competition, the aircraft must first pass a technical inspection. Then, it completes four “missions” or tasks. One is land-based, demonstrating the operation of all essential mechanisms, often at high speed. The other three missions are flight tests.

The report and all four missions are scored, and the sum of the points determines the ranking. Both components contribute to the team’s final score, with design and optimization being crucial elements of the report.

Figure 1. The DBF Edvard Rusjan Slovenian team at the DBF competition.

Real-world data

The Edvard Rusjan team has a long-standing history of success in the DBF competition, frequently placing among the top three teams. This success is attributed to their ability to design highly optimized aircraft through custom simulations. These simulations model the performance of millions of potential aircraft designs, evaluating them based on lift, drag, propulsion efficiency, and overall flight performance.

The team understands that the accuracy of their simulations depends heavily on the quality of the input data. To ensure their simulations reflect real-world performance, the team uses Dewesoft measurement systems to gather empirical data, which is then used to validate and refine their models.

Aircraft design simulations require various input data, such as airfoil lift and drag coefficients, motor efficiency, battery performance, and other factors that affect how the aircraft performs in flight. Any inaccuracies in this data can lead to flawed simulations, resulting in suboptimal design choices. Therefore, the Edvard Rusjan team consistently strives to obtain the highest quality datasets to ensure their simulations are as accurate as possible.
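To make the idea concrete, below is a minimal, hypothetical sketch in Python of the kind of brute-force design sweep described above. The scoring function, parameter grid, and all numeric values are illustrative assumptions, not the team’s actual simulation code.

```python
# Minimal sketch of a brute-force design sweep, loosely modeled on the
# idea of scoring many candidate designs. All names, values, and the
# scoring formula are illustrative assumptions, not the team's code.
import itertools

RHO = 1.225  # air density at sea level [kg/m^3]

def score(wing_area, cl, cd, motor_eff, speed=25.0):
    """Toy figure of merit: lift produced per watt of electrical power."""
    q = 0.5 * RHO * speed**2             # dynamic pressure [Pa]
    lift = q * wing_area * cl            # lift force [N]
    drag = q * wing_area * cd            # drag force [N]
    power_in = drag * speed / motor_eff  # electrical power to hold speed [W]
    return lift / power_in

# Sweep a small grid of candidate designs and keep the best one.
wing_areas = [0.30, 0.40, 0.50]             # [m^2]
airfoils   = [(1.1, 0.035), (1.3, 0.045)]   # (CL, CD) pairs
motor_effs = [0.70, 0.80]

best = max(
    itertools.product(wing_areas, airfoils, motor_effs),
    key=lambda c: score(c[0], c[1][0], c[1][1], c[2]),
)
print("best candidate:", best)
```

With accurate measured inputs for the lift, drag, and efficiency terms, a sweep like this ranks candidates reliably; with flawed inputs, it confidently picks the wrong design, which is exactly why the team invests in validation.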

In their most recent design cycle, the team challenged themselves to validate their simulations for propulsion and aerodynamic performance. This required extensive testing under real-world conditions to gather accurate data on thrust, lift, power consumption, and airflow. 

The mobile test rig

The Edvard Rusjan team needed to think outside the box to conduct their tests without access to a wind tunnel. The team devised an innovative mobile test rig that could be mounted on the roof of a car. By driving the vehicle at various speeds, they could generate airflow over the rig to simulate wind tunnel conditions in an open environment. This setup allowed the team to conduct propulsion and aerodynamic tests while controlling key variables such as airflow speed and motor throttle.

Hardware and software used

  • Windows PC 

  • DewesoftX data acquisition software version 2024.1

  • Dewesoft SIRIUSi-8xUNI universal amplifier and data acquisition system, supporting strain gauge, IEPE, voltage, RTD, resistance, and current inputs

  • Car inverter - to power the SIRIUSi-8xUNI

  • Anemometer to measure true wind speed - connected to the SIRIUSi-8xUNI

  • Current clamps to measure current draw - connected to the SIRIUSi-8xUNI

  • STG connector/DSI adapter to measure voltage - connected to the SIRIUSi-8xUNI 

  • Quarter-bridge strain gauge to measure the exerted force - connected to the SIRIUSi-8xUNI

The team outfitted the test rig with various sensors to ensure accurate data collection. These included strain gauges to measure the forces exerted on the rig, an anemometer to measure wind speed, and sensors to monitor current and voltage. The key to the success of this setup was the Dewesoft SIRIUSi-8xUNI device, which supported all the necessary sensor inputs.

Figure 2. The Dewesoft SIRIUSi-8xUNI device facilitated real-time data collection.

The SIRIUSi-8xUNI device was installed inside the vehicle and connected to a laptop, allowing the operator to monitor sensor values in real time using the DewesoftX software. This setup simplified data acquisition and enabled the team to quickly detect and resolve any issues during testing.

Figure 3. The SIRIUSi-8xUNI device was installed inside the vehicle and connected to a laptop, allowing the operator to monitor sensor values in real time.

Propulsion testing

The propulsion test rig had a sturdy base and a vertical metal bar extending upwards. The vertical bar was equipped with strain gauges to measure the perpendicular forces acting upon it during the tests. A motor was mounted at the top of the bar, pointing forward in the direction of the vehicle’s motion. The base also housed a battery to power the motor during testing. To monitor the motor’s power consumption, the team connected current clamps and a voltage meter, ensuring they could capture critical data on energy usage.
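As a rough sketch of how such a strain reading becomes a force value: for a quarter-bridge circuit, $V_{\text{out}}/V_{\text{exc}} \approx GF \cdot \varepsilon / 4$, and treating the vertical bar as a cantilever relates strain to the applied force. The Python below illustrates this chain; the gauge factor, excitation voltage, bar geometry, and material are assumed values, not the team’s rig specifications.

```python
# Hedged sketch: converting a quarter-bridge strain gauge reading on the
# vertical bar into a thrust force. Gauge factor, excitation, bar geometry,
# and material are assumed values for illustration, not the team's rig.
GF = 2.0   # gauge factor (typical for metal foil gauges)
E  = 70e9  # Young's modulus of aluminum [Pa]

def strain_from_bridge(v_out, v_exc=5.0):
    """Quarter-bridge relation: Vout/Vexc = GF * strain / 4."""
    return 4.0 * v_out / (GF * v_exc)

def force_from_strain(strain, lever_arm=0.5, width=0.03, thickness=0.01):
    """Treat the bar as a cantilever: bending stress sigma = M*c/I,
    with moment M = F * lever_arm, so F = sigma * I / (c * lever_arm)."""
    I = width * thickness**3 / 12.0  # second moment of area [m^4]
    c = thickness / 2.0              # distance to the outer fiber [m]
    sigma = E * strain               # bending stress [Pa]
    return sigma * I / (c * lever_arm)

strain = strain_from_bridge(v_out=1.0e-3)  # 1.0 mV bridge output
print(f"thrust estimate: {force_from_strain(strain):.1f} N")
```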

The team added an anemometer to the rig to measure wind speed as the vehicle accelerated. By comparing wind speed with motor performance at different throttle settings, they gained valuable insights into how thrust generation relates to energy consumption.
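The quantity being probed here reduces to a standard relation: with measured thrust $T$, true airspeed $v$, battery voltage $U$, and current draw $I$, the overall efficiency of the motor-propeller combination is

$$\eta_{\text{prop}} = \frac{P_{\text{out}}}{P_{\text{in}}} = \frac{T\,v}{U\,I}$$

For example, 25 N of thrust at 20 m/s while the battery delivers 22 V at 30 A gives $\eta_{\text{prop}} \approx 500/660 \approx 0.76$. (These numbers are illustrative, not the team’s measurements.)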

The propulsion tests were conducted by accelerating the vehicle up to 90 km/h, after which data collection was stopped. The recorded data was processed to remove noise from the strain gauge signals, and the rig’s baseline drag curve was then subtracted to isolate the propulsive force produced by the motor.
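A hedged sketch of that post-processing step, assuming a zero-phase Butterworth low-pass for the strain gauge noise and a quadratic baseline drag fit from motor-off runs (the filter settings and drag coefficient are illustrative, not the team’s values):

```python
# Hedged sketch of the post-processing described above: low-pass
# filtering the strain gauge signal and subtracting the rig's baseline
# drag curve. Filter parameters and the drag fit are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # sample rate [Hz], assumed

def lowpass(signal, cutoff_hz=10.0, order=4):
    """Zero-phase Butterworth low-pass to remove vibration noise."""
    b, a = butter(order, cutoff_hz / (FS / 2.0), btype="low")
    return filtfilt(b, a, signal)

def rig_drag(airspeed, k=0.05):
    """Baseline drag of the bare rig vs. airspeed, e.g. D = k * v^2,
    fitted beforehand from runs with the motor off (k is illustrative)."""
    return k * airspeed**2

def net_thrust(raw_force, airspeed):
    """Filtered force minus the rig's own drag gives the motor thrust."""
    return lowpass(raw_force) - rig_drag(airspeed)
```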

Figure 4. The mobile test rig with strain gauges, a motor, and a battery on the roof of a car.

Propulsion test results

The propulsion test results were compared to the existing data in the team’s simulation models. A significant discrepancy was initially observed, with an average error of approximately 45% and greater variance at higher throttle settings. This discrepancy highlighted the need for more accurate input data in the simulations, particularly for motor and propulsion characteristics.
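An “average error” of this kind is typically computed as the mean relative deviation between measured and simulated thrust; the exact metric the team used is not stated, but a common choice is

$$\bar{e} = \frac{1}{N}\sum_{i=1}^{N} \frac{\lvert T_{\text{meas},i} - T_{\text{sim},i}\rvert}{T_{\text{meas},i}}$$

so $\bar{e} \approx 0.45$ corresponds to simulated thrust values deviating from the measurements by 45% on average.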

Figure 5. The propulsion data was collected during various tests. Plots show the measured thrust force and current draw at different speeds and motor/propeller combinations.

The team reran their simulations using the corrected data from the propulsion tests. The updated results revealed a slightly different optimal design for the aircraft, which profoundly impacted the overall configuration. These insights allowed the team to make targeted adjustments to the propulsion system, resulting in a more efficient and high-performing aircraft design. The ability to refine simulations based on real-world data was crucial in ensuring the aircraft would perform well in the DBF competition.

Aerodynamic testing

In addition to propulsion testing, the team also needed to validate their simulations for aerodynamic performance, specifically lift generation. The aerodynamic test rig was designed to measure the lift generated by the main wing at different speeds and flap settings. This data was essential for determining the aircraft's optimal takeoff speeds and distances.
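The connection between measured lift and takeoff speed follows from the standard lift equation. With air density $\rho$, wing area $S$, and a maximum lift coefficient $C_{L,\max}$ at a given flap setting, the wing lifts the aircraft’s weight $W$ once

$$L = \tfrac{1}{2}\rho v^2 S C_L \quad\Rightarrow\quad v_{\text{TO}} = \sqrt{\frac{2W}{\rho\,S\,C_{L,\max}}}$$

Measuring lift directly at known airspeeds lets the team back out $C_L$ for each flap setting and predict takeoff speed and distance without relying solely on simulated coefficients.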

To conduct these tests, the team repurposed the metal bar from the propulsion rig by reorienting it horizontally at the front of the vehicle. This ensured the wing remained outside the car’s aerodynamic influence, allowing for accurate lift measurements. Once again, an anemometer was used to measure the airflow over the wing, and the SIRIUSi-8xUNI device facilitated real-time data collection.

Figure 6. The aerodynamic test rig was designed to measure the main wing lift at different speeds and flap settings.

The aerodynamic tests followed a procedure similar to that used for the propulsion tests. The vehicle was accelerated to 90 km/h, after which data collection was stopped. The lift performance data was then compared to the team’s simulation results.

Aerodynamic test results

The lift performance data collected during the tests was compared to the simulation predictions for takeoff speeds. The results showed high accuracy, with discrepancies of at most 12%. This close alignment between the real-world data and the simulations gave the team confidence in the accuracy of their aerodynamic models. 

The minor difference in results was attributed to the ground effect, which occurs during low-altitude flight and is not accounted for in the simulations. Despite this, the simulations were deemed sufficiently accurate, and the data gathered from the aerodynamic tests was used to refine the design further.

Figure 7. The lift performance data collected during the tests compared to the simulation predictions for takeoff speeds.

Conclusion

Dewesoft equipment greatly enhanced the Edvard Rusjan team’s ability to collect high-quality data, validate simulations, and optimize their aircraft design. The intuitive, user-friendly design of Dewesoft’s equipment made data collection smooth and efficient, enabling the team to focus on analyzing results instead of handling technical issues.

The propulsion and aerodynamic test results gave the team essential insights, allowing them to improve their simulations and boost the aircraft’s performance. Quickly spotting discrepancies in the simulation data and making design adjustments was invaluable for preparing for the DBF competition. With Dewesoft’s real-time monitoring, any testing issues could be immediately addressed, adding to the accuracy and reliability of the data collected.

Ultimately, Dewesoft’s measurement systems were essential to the Edvard Rusjan team’s success in the DBF competition. By delivering accurate, real-world data, Dewesoft helped the team connect their simulations with actual performance, allowing them to design a top-performing aircraft ready for high-level competition.