Artificial Intelligence Technology Scouting
- By: JAIC Public Affairs
APRIL 2021
This edition addresses the following topics:
- Using Generative Adversarial Networks in Minecraft
- Some AI Chips Go Bigger to Get Better
- Large Tech Companies Deploy Trillion-Parameter Models
Topics
GAN Minecraft for Simulation
Much publicity has been given to advances in Generative Adversarial Networks (GANs), AI models capable of generating photorealistic images and videos of events that never occurred. While many have tried to apply these systems to data generation and simulation, research on using them to simulate a fully rendered 3D environment has been limited.
Researchers at a leading university and a leading tech company have developed a GAN-based AI system that turns worlds from the video game Minecraft into photorealistic landscapes. Wherever a user turns, and whatever they build with the game’s blocks, the scene is rendered in photorealistic detail. This technology could one day make the creation of new simulation environments and landscapes accessible to anyone capable of learning the basics of Minecraft.
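For readers new to the underlying technique, the sketch below shows the basic adversarial training loop that GAN-based systems build on: a generator learns to produce samples that a discriminator cannot tell apart from real data. It is purely illustrative and is not the GANcraft model; the toy tensor sizes, layer widths, and choice of PyTorch are all assumptions made for the example.

```python
# Minimal sketch of a GAN training loop (illustrative only, not GANcraft).
# A generator maps random noise to fake "images"; a discriminator learns to
# separate real from fake, and the generator learns to fool it.
import torch
import torch.nn as nn

LATENT_DIM, IMG_DIM = 64, 32 * 32 * 3   # toy sizes chosen for the example

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),              # fake images scaled to [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),                               # single real/fake logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.rand(16, IMG_DIM) * 2 - 1           # stand-in for a batch of real images
    fake = generator(torch.randn(16, LATENT_DIM))

    # Train the discriminator: real images labeled 1, generated images labeled 0.
    d_loss = bce(discriminator(real), torch.ones(16, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(16, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator: try to make the discriminator label fakes as real.
    g_loss = bce(discriminator(fake), torch.ones(16, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```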
Link: GANcraft: Unsupervised 3D Neural Rendering of Minecraft Worlds | arXiv
Building Massive Chips for AI
Researchers and entrepreneurs are experimenting with new chip designs and architectures as they seek ever better dedicated computing resources for AI and High-Performance Computing (HPC). Graphics Processing Units (GPUs) continue to be refined, and entirely new approaches, such as chips with optical channels, are being attempted.
A few companies are attempting to meet the hardware demands of AI by building multi-trillion-transistor chips, each built on a single wafer larger than a human head. At more than half a square foot in area, these massive, and incredibly expensive, chips have been deployed by Lawrence Livermore National Laboratory for nuclear fusion and neuroscience simulations.
Link: Monster Chips add 1.6 Trillion Transistors | IEEE Spectrum
Corporate Recommendation Algorithms Grow Larger than GPT-3
To serve their customers, large tech companies are utilizing Deep Learning Recommendation Models (DLRMs): massive models, some over ten trillion parameters in size, that inform things like which advertisements a user on a social-media site is shown. This makes them much larger than even the largest language models (e.g., GPT-3).
Researchers at a leading social-media company have published details of how they train these models at such massive scale. To reach this size, they use reduced-precision communication between servers to limit bandwidth requirements and develop new ways of dividing pieces of a model across multiple servers during training. Deploying these algorithms is now among the most resource-intensive tasks in corporate data centers.
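As a rough illustration of the reduced-precision communication idea, the sketch below casts a float32 gradient shard to float16 before it is sent between servers, halving the bytes on the wire at the cost of some rounding error. The compress/decompress helpers and shard size are hypothetical and stand in for whatever the published system actually uses.

```python
# Illustrative sketch of reduced-precision gradient communication
# (hypothetical helpers; not the published DLRM training system).
import numpy as np

def compress(grad: np.ndarray) -> np.ndarray:
    """Cast a float32 gradient to float16 before sending it over the network."""
    return grad.astype(np.float16)

def decompress(payload: np.ndarray) -> np.ndarray:
    """Restore float32 on the receiving server before applying the update."""
    return payload.astype(np.float32)

grad = np.random.randn(10_000_000).astype(np.float32)   # a 10M-element gradient shard
wire = compress(grad)

print(f"float32 payload: {grad.nbytes / 1e6:.1f} MB")   # ~40.0 MB per shard
print(f"float16 payload: {wire.nbytes / 1e6:.1f} MB")   # ~20.0 MB, half the bandwidth
print(f"max rounding error: {np.abs(decompress(wire) - grad).max():.2e}")
```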
Link: Distributed Training of Deep Learning Recommendation Models | arXiv
The Artificial Intelligence Technology Scouting Blog is for AI/ML learning and education purposes only. Mention of the programs and initiatives in this blog does not necessarily constitute an endorsement by the U.S. government, the DoD, or the JAIC.