Building a Marketplace for HPC in the Cloud for Engineers, Scientists and their Service Providers
July 31, 2015

Team 118: Coupling In-house FE Code with ANSYS Fluent
This team’s end user was Hubert Dengg from Rolls-Royce; the software providers were Wim Slagter and René Kapa from ANSYS; the resource providers and team experts were Thomas Gropp and Alexander Heine from CPU 24/7; and Marius Swoboda from Rolls-Royce Deutschland acted as the HPC/CAE expert.

A transient aerothermal analysis of a jet engine high-pressure compressor assembly was performed using an FEA/CFD coupling technique. The aim of this cloud experiment was to link ANSYS Fluent with an in-house FEA code. The conjugate heat transfer process is very demanding in terms of computing power, especially when 3D CFD models with more than 10 million cells are required, so using cloud resources was of great benefit. The computation was performed on a 32-core, 2-node system. The calculation proceeded in cycles in which the FE code and Fluent ran alternately and exchanged their results.
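The article does not describe how the two solvers exchange data, so the following is only a minimal sketch, assuming the common loosely coupled pattern in which the CFD side supplies a thermal load on the metal and the FE side returns updated wall temperatures. All function names and numbers are illustrative placeholders, not the team's actual interfaces.

# Toy illustration of an alternating FE/CFD conjugate heat transfer cycle.
# These functions are stand-ins, not the team's in-house FE code or Fluent;
# the numbers are arbitrary and only demonstrate the exchange pattern.

def cfd_step(wall_T, gas_T=900.0, htc=150.0):
    """Stand-in for a Fluent cycle: return the convective heat flux [W/m^2]
    on the metal surface for the current wall temperature."""
    return htc * (gas_T - wall_T)

def fe_step(wall_T, heat_flux, dt=1.0, thermal_mass=5.0e4):
    """Stand-in for an FE thermal cycle: integrate the lumped metal
    temperature forward over one coupling interval."""
    return wall_T + dt * heat_flux / thermal_mass

wall_T = 300.0                      # initial metal temperature [K]
for cycle in range(100):            # FE and CFD run alternately, cycle by cycle
    q = cfd_step(wall_T)            # CFD pass: flow field -> thermal load on metal
    wall_T = fe_step(wall_T, q)     # FE pass: updated wall temperature for next CFD pass
print(f"wall temperature after 100 coupling cycles: {wall_T:.1f} K")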

Outsourcing the computational workload to an external cluster allowed the end user to distribute computing power efficiently, especially since the in-house computing resources were already at their limit. Bigger models usually give more detailed insights into the physical behavior of the system. In addition, the end user benefited from the HPC provider’s knowledge of how to set up a cluster, run applications in parallel with MPI, create a host file, handle licenses, and prepare everything needed for turn-key access to the cluster.
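The article mentions MPI launches and host files only in general terms. As a minimal sketch (a generic mpi4py test program, not the team's Fluent or FE solver, with an Open MPI-style host file assumed), a two-node, 32-core launch could look like this:

# check_ranks.py - generic MPI illustration (mpi4py), not the team's solvers.
# A host file lists the cluster nodes and their slots, for example:
#   node01 slots=16
#   node02 slots=16
# Launch with Open MPI:  mpirun --hostfile hosts -np 32 python check_ranks.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()              # rank of this process (0..31)
size = comm.Get_size()              # total number of MPI processes
node = MPI.Get_processor_name()     # hostname this rank runs on
print(f"rank {rank} of {size} running on {node}")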

Team 142: Virtual Testing of Severe Service Control Valve
The end user of this project was Mark A. Lobo from Lobo Engineering. Autodesk provided CFD Flex and the supporting cloud infrastructure, and the application experts were Jon den Hartog and Heath Houghton from Autodesk.

Flow control valve specifications include performance ratings so that a valve can be properly applied in fluid management systems. A control system must account for the input parameters, disturbances, and specifications of each piping system component. System response is a function of the accuracy of the control valves that respond to signals from the control system, and valve performance ratings provide the system designer with information that can be used to optimize control system response.

The premise of this project was not only to explore virtual valve testing, but also to evaluate the practical and efficient use of CFD by a non-specialist design engineer. As a benchmark, the end user had no prior experience with the Autodesk software and no formal training in it; he depended on the included tutorials, help utility, and documentation to produce good results and good data.

In this project, over 200 simulations were run in the cloud. Given the runtimes involved, and allowing for data download upon completion of the runs, the entire study could be completed within one day. For an engineer with one simulation license on a single workstation, the same work would have required about 800 hours (roughly 33 days) of nonstop, back-to-back runs.
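As a rough cross-check of those figures (the article does not state a per-run time; the value below is simply what the quoted totals imply):

\[ \frac{800\ \text{h}}{200\ \text{runs}} = 4\ \text{h per run}, \qquad \frac{800\ \text{h}}{24\ \text{h/day}} \approx 33\ \text{days}. \]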

Team 156: Pulsatile Flow in a Right Coronary Artery Tree
This team consisted of the end user Prahlad G. Menon from Carnegie Mellon University, the cloud resource providers Amazon AWS and Nephoscale, and Burak Yenier from UberCloud, who containerized OpenFOAM and the post-processing tool ParaView. In this study, blood flow inside a patient-specific right coronary artery tree (see Figure 11), with one inlet and a total of seven outflow branches, was simulated under realistic unsteady flow conditions; the geometry was segmented from tomographic MRI slice images obtained across the whole body of a male volunteer.

The finite-volume mesh, consisting of 334,652 tetrahedral cells, was passed as input to the icoFoam solver in OpenFOAM to compute the unsteady, incompressible blood flow through the artery tree.
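The exact commands run inside the UberCloud container are not given in the article; the sketch below shows the standard OpenFOAM sequence for a parallel icoFoam run (decompose, solve, reconstruct), driven from Python purely for illustration. The case directory name and core count are assumptions.

# Standard parallel OpenFOAM workflow for an icoFoam case, scripted for
# illustration; assumes system/decomposeParDict requests 16 subdomains
# and that the hypothetical case directory below exists.
import subprocess

CASE = "coronaryArteryCase"    # hypothetical case directory name

def run(cmd):
    """Run an OpenFOAM command inside the case directory and fail on error."""
    subprocess.run(cmd, cwd=CASE, check=True)

run(["decomposePar"])                                  # split the mesh across processors
run(["mpirun", "-np", "16", "icoFoam", "-parallel"])   # transient incompressible solve
run(["reconstructPar"])                                # merge results for ParaView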

The UberCloud OpenFOAM container was found to be far more user friendly, in terms of both simplicity and ease of access (via SSH), than the supercomputing resources used by the end user on a daily basis. Parallel scalability of the simulated problem was good on the remote computing resource: a two- to three-fold speed improvement was noted for 16-core parallelism in the remote UberCloud container compared with an equivalent simulation run on just 4 parallel cores of a local quad-core machine.
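Expressed as parallel efficiency relative to the 4-core local baseline (this is only the ratio implied by the figures above; no additional timing data is given in the article):

\[ E_{\text{rel}} = \frac{S}{16/4} = \frac{2\text{ to }3}{4} \approx 0.5\text{ to }0.75. \]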

Team 165: Wind Turbine Aerodynamics Study
The end user and CFD expert of this team was Praveen Bhat, a technology consultant from India; the software provider was Metin Ozen, an ANSYS reseller in California; and the resource provider was ProfitBricks.

The team evaluated wind turbine performance using ANSYS CFX on a 62-core HPC cloud server with 240 GB RAM from ProfitBricks. The ANSYS software ran in UberCloud’s new application container. The CFD simulation was performed to calculate the pressure distribution and velocity profiles around the wind turbine blades at an average wind speed of 7 to 8 m/s. Figure 12 highlights the velocity distribution around the wind turbine blades.

The computational requirements of a fine mesh with 2.5 million cells are too high for a normal workstation. The HPC cloud made it possible to solve such fine-mesh models with drastically reduced solution times of about 1.5 hours.

The UberCloud ANSYS container enabled easy access to and use of the cloud server, and the regular UberCloud auto-update emails provided the huge advantage of continuously monitoring job progress without having to log in to the server and check the status.
