If 2023 has shown us anything, it is that semiconductor memory is not a field for the timid. The year 2024 holds great potential for memory producers, but it will also present challenges for OEMs and IDMs, who will have to bid farewell to the attractive prices of 2023.
Flash memory will continue to be used for storage as long as flash chips remain affordable. Storage is not the whole picture, however; other cutting-edge memory technologies are also gaining ground. Because NAND flash is nonvolatile, it retains its data even after the machine is powered off. System memory, by contrast, is currently based on dynamic RAM (DRAM), is used for active computation, and loses its contents when power is removed.
Semiconductor memory technologies are essential to contemporary computer systems because they are the primary means of storing and retrieving digital data. They span a wide variety of memory types, each with characteristics suited to particular applications.
The need for higher capacity, greater speed, and lower power consumption is pushing memory technology forward. More sophisticated memory solutions are also required by the growth of data-intensive applications such as big data analytics, machine learning (ML), and artificial intelligence (AI). Because of this, the semiconductor memory industry is expected to keep growing and to be shaped by continuous advances.
The memory hierarchy is the arrangement of memory types in a computer system, from high-speed, low-capacity registers and caches down to slower but larger main memory and permanent storage.
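As a rough illustration of why this hierarchy matters to software, the C sketch below (not from the original article; the tier sizes and the build target are assumptions about a typical x86 Linux system) times chains of dependent loads over working sets of increasing size. Sets that fit in cache are served in a few nanoseconds per access, while sets that spill into DRAM take an order of magnitude longer.

```c
/*
 * Hedged sketch: estimate average access latency for working sets sized to
 * fit in L1 cache, last-level cache, and main memory (DRAM). The buffer
 * holds a random single-cycle permutation, so every load depends on the
 * previous one (pointer chasing) and prefetching cannot hide the latency.
 * Build (assumed): gcc -O2 latency.c -o latency
 */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double ns_per_access(size_t bytes)
{
    size_t n = bytes / sizeof(size_t);
    size_t *next = malloc(n * sizeof(size_t));
    if (!next) return -1.0;

    /* Sattolo's shuffle: builds a permutation that forms one big cycle. */
    for (size_t i = 0; i < n; i++) next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    const size_t iters = 20 * 1000 * 1000;
    size_t idx = 0;
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < iters; i++)
        idx = next[idx];                 /* each load waits for the last */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    /* Use idx so the compiler cannot discard the loop. */
    if (idx == (size_t)-1) printf("unreachable\n");

    free(next);
    double ns = (t1.tv_sec - t0.tv_sec) * 1e9
              + (double)(t1.tv_nsec - t0.tv_nsec);
    return ns / (double)iters;
}

int main(void)
{
    /* Assumed tier sizes: 32 KB (L1), 8 MB (LLC), 256 MB (DRAM). */
    size_t sizes[] = { 32u << 10, 8u << 20, 256u << 20 };
    for (size_t i = 0; i < sizeof sizes / sizeof sizes[0]; i++)
        printf("%6zu KB working set: %.1f ns per access\n",
               sizes[i] >> 10, ns_per_access(sizes[i]));
    return 0;
}
```

Exact figures depend on the CPU and DRAM generation, but the step change between the cache-resident and DRAM-resident cases is precisely the gap the hierarchy is designed to bridge.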
The lines between the tiers of this hierarchy are nevertheless becoming less distinct due to new developments such as Compute Express Link (CXL). CXL is a high-speed interconnect standard that lets processors access different kinds of memory and accelerators as if they were part of the CPU's own memory space. This design enables more flexible and efficient data movement across memory types, from working memory to storage-class memory and even AI accelerators such as GPUs or FPGAs.
The result is looser boundaries between the tiers of the memory hierarchy and greater system-wide integration and accessibility of memory resources. This blurring opens the door to more unified and adaptable memory architectures, which in turn let data-intensive applications gain performance, energy efficiency, and scalability in modern computer systems.
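For a concrete view of how such disaggregated memory can surface to software: on Linux, CXL-attached memory is typically exposed as a CPU-less NUMA node, so ordinary allocation APIs can place data on the far tier. The libnuma sketch below simply assumes such a node exists and picks the highest-numbered one; the node numbering and tier mapping are platform-specific assumptions, not something the article specifies.

```c
/*
 * Hedged sketch: allocate a buffer on a far-memory NUMA node using libnuma.
 * Assumption: the highest-numbered node is the CXL/far-memory tier, which
 * is platform-dependent and must be verified (e.g. via numactl -H).
 * Build (assumed): gcc cxl_alloc.c -o cxl_alloc -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA support not available\n");
        return 1;
    }

    int far_node = numa_max_node();   /* assumed far-memory node */
    size_t len = 64u << 20;           /* 64 MB */

    void *buf = numa_alloc_onnode(len, far_node);
    if (!buf) {
        fprintf(stderr, "allocation on node %d failed\n", far_node);
        return 1;
    }

    memset(buf, 0, len);              /* touch pages so they are placed */
    printf("64 MB backed by NUMA node %d, accessed with ordinary loads/stores\n",
           far_node);

    numa_free(buf, len);
    return 0;
}
```

The point of the sketch is that the application code does not change: far memory behind CXL is reached with the same load/store instructions as local DRAM, only with different latency and capacity characteristics.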
Nevertheless, the underlying memory device technologies remain the essential building blocks of such a flexible memory architecture. The sections below outline the popular and emerging memory device technologies and examine their future trends alongside the current state of the art in the industry.
DDR5 Will See Widespread Adoption
Average DRAM capacity is expected to rise at a pace of about 12.4% per year. One driver is Intel's upcoming Meteor Lake CPUs, which will go into general production in 2024 and support only DDR5 and LPDDR5 memory. Major semiconductor manufacturers are increasing DDR5 output as a result, which suggests that DDR5 will be in widespread use by the second half of 2024. DDR5 will also soon emerge as a major memory technology in embedded and industrial systems.
Memory Demand Driven by AI
Since ChatGPT was introduced, generative AI applications have surged in popularity, increasing demand for AI servers. TrendForce predicts that almost 1.2 million AI servers will be operational by the end of 2023, a 37.7% rise from the previous year, and that shipments will grow by a further 38% in 2024, at which point AI servers will account for 12% of total server sales.
Though some may believe this is just a passing fad, a recent Gartner poll found that only 2% of industry leaders had no intention of experimenting with generative AI, and nearly 70% of respondents felt that GenAI's benefits outweigh its risks.
Among DRAM products, high-bandwidth memory (HBM) is particularly important for AI. Demand for HBM is predicted to rise dramatically as training models and applications become more complex. Given that the average unit price of HBM is significantly higher than that of other DRAM products, HBM is anticipated to grow at an annual rate of 172% in 2024.
Sustainability Drive Among Memory Manufacturers
Memory makers are aiming for products with zero emissions. They are also continually looking for ways to use less water and energy, for example by soldering at lower temperatures. Another important goal is reducing hazardous compounds in memory products. Exemptions from the REACH and RoHS rules are being phased out, even for earlier generations and legacy items. In 2024, lead-free versions of all DRAM modules, down to DDR1, will be available, in addition to the entirely lead-free DDR5 modules first launched in 2023.
References: EVERTIQ