Helping to clean up a Tokamak reactor
Around 2010, our team was asked to build a large X-ray machine for the JET Tokamak reactor in the UK. It wasn't the kind of equipment used in hospitals. In fact, it was a very fast X-ray camera that looks into the core of the thermonuclear reactor in search of impurities. It took us three years and a lot of fun to build. The point of the project was to measure the concentration of impurities that come from the Tokamak's construction elements, such as the walls. The temperature inside is very high, roughly 100 million K, much hotter than the Sun. The reactor produces a lot of fast neutrons that hit the walls and produce debris, which is then melted by the plasma and emits X-ray radiation. The impurities are mainly iron and nickel, the main components of stainless steel. Managing such impurities is critical for long-term Tokamak operation and for future nuclear fusion power plants. Tokamaks have a kind of exhaust pipe, a magnetic trap that collects such impurities, but to make it operational, scientists need to know how the impurities move and how much of them is produced. In this project we built the spectrometer in an old-fashioned way: just a crystal and a linear detector array.
More on the principle is here: https://slideplayer.com/slide/5265519/
Here is the detector system attached to the spectrometer arm.
The system consists of two 256-channel X-ray detectors. The detectors are optimised for the 2.4 keV and 7.8 keV energies.
The detectors are based on triple-layer Gas Electron Multipliers (GEMs). The diagram below presents some details of their construction.
The GEM detector is attached to a small 3U chassis that holds the Analog Front End modules.
Signals from the detector assembly are routed over 16 dedicated high-speed cables to processing boxes, each based on 21 FPGA chips.
Here is what the processing box looks like inside. On the left side there is an ATX PSU (covered by a black sheet) and the ITX mainboard. On the right side, a backplane with 4 FMC carriers is installed.
Each backplane holds 4 carriers.
Each carrier holds 4 FMC modules. All boards and mechanics were developed by our group. Below is the FMC module without the front panel and the connector. The module is heavily packed yet has only 4 layers. It contains:
- 8 × dual-channel 130 MS/s 10-bit ADCs
- 16 differential line amplifiers
- 8 × 16-bit isolation buffers
- a Spartan-6 FPGA
- DDR3 SDRAM memory
- a 4-channel SPI DAC
- 4 op-amps that boost the DAC output to 10 V
- a 5 V SMPS plus several LDOs
- TTL buffers/translators
- an FMC LPC connector
So we had 256 channels, each generating roughly 1 Gbit/s of data. That's a lot of data to process, and there was no way to digest it in a CPU: first, the CPU would need that amount of data delivered to it; then it would need enough memory to store it for processing; and we needed the results in real time. And remember, it was 2010! We had no choice but FPGAs. Spartan-6 had just been announced, and we were among the first brave enough to test it! Below is a photo of me with Xilinx ISE and 21 Spartan-6 FPGA devices connected as one JTAG chain.
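As a back-of-the-envelope check, the per-channel and aggregate rates follow directly from the ADC figures above (130 MS/s at 10 bits per sample), assuming raw samples with no on-the-fly reduction:

```python
# Rough data-rate estimate for the spectrometer front end.
# Channel count and ADC figures come from the article; this is the raw
# ADC output rate, before any FPGA-side event extraction.

CHANNELS = 256          # detector channels in total
SAMPLE_RATE = 130e6     # ADC sample rate, samples/s per channel
BITS_PER_SAMPLE = 10    # ADC resolution

per_channel_gbit = SAMPLE_RATE * BITS_PER_SAMPLE / 1e9
total_gbit = per_channel_gbit * CHANNELS

print(f"per channel: {per_channel_gbit:.2f} Gbit/s")  # per channel: 1.30 Gbit/s
print(f"total:       {total_gbit:.1f} Gbit/s")        # total:       332.8 Gbit/s
```

Hundreds of gigabits per second of raw samples is why the event extraction had to happen right next to the ADCs, in FPGA fabric, rather than in a CPU.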
So the tasks were run in parallel. The FPGAs on the FMC modules were responsible for triggering and event identification: they computed pulse energy, position, and time. The biggest Spartan-6 FPGA, on the carrier board, was used to calculate histograms and to communicate with the remaining FPGAs. Communication was a real challenge: we used both GTP and LVDS links, and the interfaces consumed a lot of FPGA resources. We implemented error checking and correction mechanisms and data quality monitors at every stage, and ended up using roughly 90% of the FPGA resources. That was one of the reasons we decided to build a new system for our next job. The results are quite impressive: we built an X-ray spectrometer with 10 ms timing resolution that estimates the energy of every single photon.
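The per-event processing described above can be sketched as a toy software model: trigger on a threshold crossing, estimate the pulse energy as the baseline-corrected sample sum, and accumulate an energy histogram. This is only an illustration of the principle; the real system did this in FPGA fabric, and the threshold, integration window, and bin count below are invented values, not the ones used at JET.

```python
# Toy model of per-channel event extraction: threshold trigger,
# baseline-corrected energy estimate, and energy histogramming.
# All constants are illustrative, not the real system's parameters.
import numpy as np

THRESHOLD = 50        # trigger level above baseline, ADC counts (illustrative)
WINDOW = 8            # samples integrated per pulse (illustrative)
N_BINS = 64           # energy histogram bins (illustrative)
MAX_ENERGY = 1 << 13  # histogram full scale (illustrative)

def process_channel(samples):
    """Return (events, histogram) for one channel's sample stream."""
    baseline = np.median(samples)            # crude baseline estimate
    hist = np.zeros(N_BINS, dtype=np.int64)
    events = []
    i = 0
    while i < len(samples) - WINDOW:
        if samples[i] - baseline > THRESHOLD:         # trigger condition
            pulse = samples[i:i + WINDOW]
            energy = float(np.sum(pulse - baseline))  # energy estimate
            events.append((i, energy))                # (time index, energy)
            bin_idx = min(int(energy / MAX_ENERGY * N_BINS), N_BINS - 1)
            hist[bin_idx] += 1
            i += WINDOW                               # skip past the pulse
        else:
            i += 1
    return events, hist

# Synthetic stream: flat noisy baseline with two injected pulses.
rng = np.random.default_rng(0)
stream = rng.normal(100, 2, 1000)
stream[200:204] += 400
stream[700:704] += 900
events, hist = process_channel(stream)
print(len(events), int(hist.sum()))  # 2 2
```

In the real system this logic ran on every channel in parallel inside the FMC-module FPGAs, while the carrier-board FPGA merged the resulting histograms.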
Apart from the processing system and the Front Ends, we also developed a dedicated high-voltage power supply in cooperation with the Creotech company.
A lot of scientific papers were (and still are) produced with this system.
The system was installed in 2013 and still works without issues today (January 2019).