Optical processing: a paradigm shift for CFD?
September 27, 2014

In electronic systems, the Fourier transform (FT) is a highly processor-intensive, parallelisable operation based on FFT (Fast Fourier Transform) algorithms, which nonetheless do not scale well on serial electronic processors. “However,” Dr New adds, “using our optical solution, the processing time remains the same, as the processor calculates at the speed of light.”

Derivative functions are calculated using the same FT-based process employed in electronic systems – wherein high-end solvers apply successive FT stages to solve the Navier-Stokes equations.
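To make the FT-based process concrete, here is a minimal sketch (in NumPy, the electronic analogue of what the optical system computes) of spectral differentiation: transform a field, multiply by ik in Fourier space, and transform back. All variable names and parameters here are illustrative, not drawn from Optalysys' system.

```python
import numpy as np

# Spectral (FT-based) differentiation on a periodic 1-D grid.
N = 64
L = 2 * np.pi
x = np.linspace(0, L, N, endpoint=False)
u = np.sin(x)                                  # sample field u(x)

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)     # angular wavenumbers
du = np.fft.ifft(1j * k * np.fft.fft(u)).real  # du/dx via multiply-by-ik

# For band-limited data this is exact: d/dx sin(x) = cos(x).
```

A Navier-Stokes spectral solver chains many such transform stages per time step, which is why an O(1) transform engine is attractive.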

In the case of the optical system, however, the process remains an “Order 1” (O(1)) operation – that is, the time taken to process multiple stages is the same as the time taken to process one, because the processing is carried out in parallel.

Dr New explains, “By adopting a dynamically addressable approach we can reconfigure the optical system and flip the functionality between number crunching and pattern recognition, allowing the data produced to be analyzed and the results fed back into the model. Post-processing of results can take place in real time, with live ‘video feed’ outputs letting the user monitor the formation of structures within the flow as it happens, so that changes can be detected and structures identified right away.”
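The flip between number crunching and pattern recognition uses the same FT machinery: cross-correlation – the electronic analogue of optical matched filtering – is just a pointwise product in Fourier space. A minimal NumPy sketch (the embedded data and offset are purely illustrative):

```python
import numpy as np

# Pattern recognition via the FFT: circular cross-correlation of a
# signal with a template is IFFT(FFT(signal) * conj(FFT(template))).
signal = np.zeros(128)
template = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
signal[40:45] = template                       # embed the pattern at offset 40

corr = np.fft.ifft(np.fft.fft(signal) *
                   np.conj(np.fft.fft(template, n=128))).real

peak = int(np.argmax(corr))                    # correlation peak locates it
```

The correlation peak lands at the embedding offset, which is how an FT engine can identify structures in a flow field as readily as it can differentiate one.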

By enabling real-time transient analysis of flow data, the Optalysys system eliminates the need to periodically dump data, saving significant disc access time and reducing electronic bottlenecks. And, because it employs inherently low-power devices, it will cost a fraction of the colossal $21m per annum required to run Tianhe-2 – currently the world’s fastest supercomputer, built by China’s National University of Defense Technology.

Disruptive pricing

Although the commercial launch of the company’s first Big Data Analysis Unit is still four years away (scheduled for summer 2018), Optalysys’ prototype, which meets NASA Technology Readiness Level 4, is scheduled for completion by January of next year and is expected to run at over 340 gigaflops – enabling it to analyze large data sets, and perform post-processing tasks, in a laboratory environment.

And, unsurprisingly, early conversations with customers have been extremely positive. Whilst today’s “cheapest” supercomputer, the Cray XC30-AC, costs between £300k and £1.8m and delivers a maximum processing power of 176 teraflops, Optalysys’ Big Data Analysis Unit will deliver 1.32 petaflops in year one, increasing to 300 petaflops in year five – and will carry a disruptive list price.

Whilst other future technologies such as the D-Wave 2 quantum computer are set to further enhance processing power – to around 1.5 petaflops in the case of the D-Wave – they are designed to address specific tasks, such as encryption, and at a cost of around £9m.

In fact, Optalysys’ closest competitors – in terms of the ability to process HPC and Big Data workflows – are more likely to be the major chip manufacturers. But, given that Optalysys represents a genuine paradigm shift, Dr New believes these organizations are more likely to partner with the company going forward.

Dr New concludes, “Ultimately, we believe the supercomputing market will grow more quickly given the significant cost savings our optical processing technology will provide – bringing supercomputer power to a much broader market.”

We’ve certainly heard that before, but this time the prospect seems genuinely promising.



© HPC Today 2021 - All rights reserved.
