
Featured Articles

Executive Q&A: Carlos Labra on Altair® EDEM™ Achieving the First 1 Billion Particle Discrete Element Method Simulation

Altair® EDEM™ is the global market leader in discrete element method (DEM) technology for bulk and granular material simulation. Blue-chip companies worldwide use EDEM to optimize equipment design, increase productivity, reduce operational costs, shorten product development cycles, and drive product innovation. 

Recently, we sat down with Carlos Labra, senior EDEM product director, to hear more about the work that he and his team are doing – namely, his team's achievement of the first 1 billion particle DEM simulation and the opportunity the team had with Google Cloud to try out the new Google A3 virtual machines. 



Q: For those unfamiliar with bulk and granular materials, can you explain what this means in terms of industrial application? Where does DEM fit?

Carlos Labra: These types of materials are everywhere – in many industries – in the form of rocks, soils, powders, crops, and others. They represent around 70% of raw materials in industry, and handling and processing them consumes a huge amount of energy. Designing and optimizing the equipment that handles these materials is a complex challenge for engineers, but small improvements have a tremendous impact in terms of cost savings, energy consumption, and time to market. 

Similar to computational fluid dynamics (CFD) for fluids or finite element analysis (FEA) for structural analysis, DEM is used to simulate particle behavior and equipment that interacts with particles. DEM helps engineers understand how to optimize processes and machines.


Q: EDEM has been at the forefront of DEM for nearly 20 years now. What distinguishes EDEM as the market leader for DEM technology?

CL: DEM has been around since the mid-1970s; however, for many years, the technology’s complexity and computational cost made it very difficult to implement compared with other traditional CAE technologies.

When it arrived, EDEM was developed as an easy-to-use, desktop-oriented tool that allowed users to implement DEM simulations much more easily, similar to how other CAE tools in the market operated. This was a huge difference from other tools available at the time, and it helped DEM technology gain traction in industry by combining best-in-class physics and performance within an easy-to-use environment. 


Q: Simulation speed has been a hot topic in recent years. How has GPU technology made an impact for EDEM users?

CL: As I mentioned earlier, DEM’s main limitation compared with other simulation technologies is its computational costs. This is because DEM computes every single interaction between particles in a system with thousands or millions of particles. 
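To make that cost concrete, here is a toy sketch (not EDEM's actual solver) of the kind of pairwise contact loop at the heart of DEM. The linear spring-contact model and all parameter values are illustrative assumptions; the point is that a naive implementation checks every particle pair, so the work grows as O(N²), which is exactly what production codes attack with neighbor grids and GPU kernels.

```python
import numpy as np

def contact_forces(pos, radius, k=1e4):
    """Naive DEM contact step: check every particle pair and apply a
    linear spring force wherever two spheres overlap. The double loop
    costs O(N^2), which is why large DEM runs need smarter neighbor
    searches and massive parallel hardware."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = 2 * radius - dist
            if overlap > 0:  # spheres are in contact
                f = k * overlap * d / dist  # repulsive spring force
                forces[i] -= f
                forces[j] += f
    return forces

# Two overlapping particles push apart; a distant third feels nothing.
pos = np.array([[0.0, 0.0, 0.0], [0.015, 0.0, 0.0], [1.0, 0.0, 0.0]])
f = contact_forces(pos, radius=0.01)
```

Even at millions of particles, this pair-interaction structure is what makes DEM so much more expensive than mesh-based methods at comparable scales.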

As computing hardware evolved, DEM became a more interesting technology for industry. High-performance computing (HPC) was the obvious starting place, but it was still very expensive and not many people had access to it. To run a very large simulation, you needed access to an HPC facility with thousands of processor cores working together, which entailed huge costs. When GPU technology arrived and began accelerating computation, this was a game changer and allowed us to use a completely different approach for DEM. For the first time, it allowed us to push industrial-scale simulations and complexity in a reasonable time using only a single desktop.


Q: Altair has an amazing network of partners. What was the main motivation behind your recent collaboration with Google Cloud?

CL: I think it was more of an opportunity. We have cloud technology, but the recent evolution of GPUs and artificial intelligence (AI)-based GPUs developed by others has been another game changer for simulation. The idea of interconnecting multiple GPUs to push simulation size and complexity was very attractive to us.

Last year, we released multi-GPU solvers implemented in EDEM, which opened a huge door for us. Today, you can run simulations using multiple GPUs, but if you really want to explore the limits of particle simulation, you need a big cloud system. In industry, only a few providers of that level of technology are available (not considering traditional HPC, which is more suited to academia).

When Google launched the A3 virtual machine, we realized that we were in a position to use the largest available GPU computation capability to explore the limits of DEM simulations in the cloud without the need to implement complex HPC multi-node communications and data structures. This provides a better, more cost-effective alternative for our customers.

This pushed us to pursue the first simulation of a 1 billion particle system in an industrial environment, which we accomplished. It is important to highlight that this is not the first simulation of this size overall, but it is the first one run in an environment that everyone in industry can access, unlike dedicated HPC systems.


Q: What were the key takeaways from this project with Google Cloud? How are EDEM users benefitting from it? 

CL: What we have learned is that with this type of hardware capability, we no longer have to utilize HPC facilities with millions of cores to run large DEM simulations. Now, with a single node that has a small group of top-level GPUs, you can have the same computational capability, which is far more cost-effective while still being easy to use like a desktop application. This means we can focus on delivering more physics and advanced capabilities for large applications.

Also, we know that not all our customers are interested in running simulations with 1 billion particles or more; however, this project allowed us not just to explore the limits of simulation in terms of size, but also how we get there. The specific cases we used were designed to understand the resources required under certain conditions, such as GPU memory. With that, we can map the size of the simulation versus the hardware requirements in this type of infrastructure. That means that when I have a customer interested in trying cloud or GPU technology, we can better advise them on the requirements based on their specific simulation setup.
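As a rough illustration of that size-versus-hardware mapping, one can budget GPU memory per particle. The bytes-per-particle figure and overhead fraction below are invented assumptions for the sketch, not EDEM's measured requirements; real usage depends on the contact model, particle shape, and what results are stored.

```python
def particles_that_fit(gpu_mem_gb, bytes_per_particle=400, overhead=0.2):
    """Back-of-envelope capacity estimate: how many particles fit in a
    given amount of GPU memory. bytes_per_particle and overhead are
    illustrative assumptions only, not EDEM's actual figures."""
    usable = gpu_mem_gb * 1e9 * (1 - overhead)  # reserve some memory for the solver
    return int(usable / bytes_per_particle)

# Eight 80 GB GPUs on a single node, under these assumptions:
n = particles_that_fit(8 * 80)
print(f"{n:,} particles")  # on the order of a billion particles
```

An estimate like this, calibrated against real benchmark runs, is what lets the team advise a customer on hardware requirements from their simulation setup before they ever provision a cloud machine.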


Q: What do you expect the next steps are to further develop the capabilities of large-scale simulations?

CL: Since we have shown that it is possible to run a simulation of this size in a commercial environment, we are now planning to build on that by refining our offerings specifically for the cloud. Using this project as a reference, we have learned not just about the amount of CPU and GPU memory required, but also about what it takes to save a huge amount of data and visualize the results of these large simulations. This has helped us identify areas we can improve to make this much easier and, hence, what capabilities are needed to handle these cases. 

This has also increased the appetite for adding more physics and complexity to simulations, in combination with the continued growth in size. Because of this, we are also working on smarter ways to use simulation so that we are ready, with a more robust cloud-based offering for Altair users, when even larger simulations of 2 or possibly 3 billion particles are required.


Q: What are the big trends regarding the use of DEM in industry right now?

CL: Simulation scale and complexity continue to be some of the key trends in DEM, especially with the rapid development of GPU hardware. Because of this, we are working on using these new technologies to gain raw performance in our solver, and we are also looking to make smarter simulations that allow our users to reduce the effort on a whole project when multiple simulations are required.

Also, with the rapid growth of AI, the possibility of building digital twins in different spaces is gaining traction as a way to provide decision-making support to operators and find optimal design solutions for engineers. This is an area where EDEM benefits from being part of Altair’s ecosystem. We can combine EDEM with Altair’s array of other tools to make digital twins a reality in any industry. Today, we are using EDEM to generate synthetic data that can then be used to train a machine learning/AI model. These models can then be used to optimize the simulated process without the need for many simulations; they can also be integrated into a multiphysics solution with other simulation technologies, reducing the cost of connecting multiple solvers.


Q: A lot of readers will be wondering if they can try Google Cloud for themselves. How can users get their hands on this technology?

CL: As part of Altair’s partnership with Google, we have agreed to offer the possibility of applying for a trial of Google Cloud with EDEM. I would encourage people to register quickly before the offer expires.



To learn more about Altair and Google Cloud’s collaboration, take a look at Carlos Labra’s latest blog on the Altair Community.