Sponsored by AMD and Supermicro

Performance Intensive Computing

Capture the full potential of IT

What you need to know about high-performance storage for media & entertainment


To store, process and share their terabytes of data, media and entertainment content creators need more than your usual storage.


Maintaining fast, efficient and reliable data storage in the age of modern media and entertainment is an increasingly difficult challenge.

Content creators ranging from independent filmmakers to major studios like Netflix and Amazon are churning out enormous amounts of TV shows, movies, video games, and augmented and virtual reality (AR/VR) experiences. Each piece of content must be stored in a way that ensures it’s easy to access, ready to share and fast enough to stream.

This becomes a monumental task when you’re dealing with petabytes of high-resolution footage and graphics. Operating at that scale can overwhelm even the most seasoned professionals.

Those pros must also maintain both primary and secondary storage. Primary storage is designed to deliver rapid data retrieval. Secondary storage, by contrast, trades slower access times for lower-cost, long-term retention.

Seemingly Insurmountable Odds

For media and entertainment production companies, the goal is always the same: speed production and cut costs. That’s why fast, efficient and reliable data storage solutions have become a vital necessity for those who want to survive and thrive in the modern age of media and entertainment.

The amount of data created in a single media project can be staggering.

Each new project uses one or more cameras producing footage with a resolution as high as 8K. And content captured at 8K has 16 times more pixels per frame than traditional HD video. That translates to around 1 terabyte of data for every 1.5 to 2 hours of footage.

For large-scale productions, shooting can continue for weeks, even months. At roughly a terabyte for every 2 hours of shooting, that footage quickly adds up, creating a major data-storage headache.
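
The arithmetic above can be sketched as a quick back-of-the-envelope estimate. This is an illustrative calculation, not a vendor tool; the per-hour figure is the article's rough average, and the shoot parameters in the example are hypothetical:

```python
# Back-of-the-envelope storage estimate for an 8K shoot.
# All figures are illustrative assumptions based on the article's rough numbers.

HD_PIXELS = 1920 * 1080         # ~2.07 megapixels per HD frame
UHD8K_PIXELS = 7680 * 4320      # ~33.2 megapixels per 8K frame

pixel_ratio = UHD8K_PIXELS / HD_PIXELS   # 16x more pixels than HD

TB_PER_HOUR = 1 / 2             # ~1 TB per 2 hours of footage (article's figure)

def shoot_storage_tb(hours_per_day: float, days: int, cameras: int = 1) -> float:
    """Raw-footage storage needed for a shoot, in terabytes."""
    return hours_per_day * days * cameras * TB_PER_HOUR

print(pixel_ratio)                          # 16.0
print(shoot_storage_tb(8, 30, cameras=2))   # 240.0 TB for a month-long, 2-camera shoot
```

At those rates, a single month of two-camera production already demands hundreds of terabytes before any effects or AR/VR assets are added.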

But wait, there’s more: Your customer’s projects may also include both AR and VR data. High-quality AR/VR can contain hundreds of effects, textures and 3D models, producing data that measures not just in terabytes but petabytes.

Further complicating matters, AR/VR data often requires real-time processing, low-latency transfer and multiuser access.

Deploying AI adds yet another dimension. Generative AI (GenAI) now has the ability to create stunning additions to any multimedia project. These may include animated backgrounds, special effects and even virtual actors.

However, AI accounts for some of the most resource-intensive workloads in the world. To meet these stringent demands, not just any storage solution will do.

Extreme Performance Required

For media and entertainment content creators, choosing the right storage solution can be a make-or-break decision. Production companies that generate data at the highest rates need something like the Supermicro H13 Petascale storage server.

The H13 Petascale storage server boasts extreme performance for data-intensive applications. For major content producers, that means high-resolution media editing, AR and VR creation, special effects and the like.

The H13 Petascale storage server is also designed to handle some of the tech industry’s most demanding workloads. These include AI and machine learning (ML) applications, geophysical modeling and big data.

Supermicro’s H13 Petascale storage server delivers up to 480 terabytes of high-performance storage via 16 hot-swap all-flash drives. The system is built on NVMe storage in the Enterprise and Datacenter Standard Form Factor (EDSFF) E3 format to provide high-capacity scaling. The 2U Petascale version has double the storage bays and capacity.

Operating on the EDSFF standard also offers better performance with PCIe 5 connectivity and improved thermal efficiency.

Under the hood of this storage beast is a 4th generation AMD EPYC processor with up to 128 cores and 6TB of DDR5 memory. Combined with 128 lanes of PCIe 5 bandwidth, the H13 delivers more than 200GB/sec of bandwidth and more than 25 million input/output operations per second (IOPS).
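
Those headline specs can be sanity-checked with simple arithmetic. A minimal sketch, assuming 16 drives of 30TB each (the article quotes only the 480TB total, so the per-drive capacity is an inference):

```python
# Rough sanity check on the H13 Petascale figures quoted above.
# Per-drive capacity is an assumption derived from the 480TB / 16-drive totals.

DRIVES = 16
CAPACITY_PER_DRIVE_TB = 30          # assumed: 16 x 30TB E3.S drives = 480TB
SYSTEM_BANDWIDTH_GBS = 200          # article's ">200 GB/sec" aggregate figure

total_capacity_tb = DRIVES * CAPACITY_PER_DRIVE_TB
per_drive_gbs = SYSTEM_BANDWIDTH_GBS / DRIVES   # ~12.5 GB/s per drive, a plausible
                                                # figure for a PCIe 5.0 NVMe device

print(total_capacity_tb)   # 480
print(per_drive_gbs)       # 12.5
```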

Data’s Golden Age

Storing, sending and streaming massive amounts of data will continue to be a challenge for the media and entertainment industry.

Emerging formats will push the boundaries of resolution. New computer-aided graphics systems will become the industry standard. And consumers will continue to demand fully immersive AR and VR experiences.

Each of these evolutions will produce more and more data, forcing content creators to search for faster and more cost-effective storage methods.

Note: The media and entertainment industry will be the focus of a special session at the upcoming Supermicro Open Storage Summit ‘24, streaming live from Aug. 13 to Aug. 29. The M&E session, scheduled for Aug. 14 at 10 a.m. PDT / 1 p.m. EDT, will focus on AI and the future of media storage workflows. The speakers will represent Supermicro, AMD, Quantum and Western Digital. Learn more and register now to attend the 2024 Supermicro Open Storage Summit.


Why M&E content creators need high-end VDI, rendering & storage


Content creators in media and entertainment need lots of compute, storage and networking. Supermicro servers with AMD EPYC processors help by delivering improved rendering and high-speed storage.


When content creators at media and entertainment (M&E) organizations create videos and films, they’re also competing for attention. And today that requires a lot of technology.

Making a full-length animated film involves no fewer than 14 complex steps, including 3D modeling, texturing, animating, visual effects and rendering. The whole process can take years. And it requires a serious quantity of high-end compute, storage and software.

From an IT perspective, three of the most compute-intensive activities for M&E content creators are VDI, rendering and storage. Let’s take a look at each.

* Virtual desktop infrastructure (VDI): While content creators work on personal workstations, they need the kind of processing power and storage capacity available from a rackmount server. That’s what they get with VDI.

VDI separates the desktop and associated software from the physical client device by hosting the desktop environment and applications on a central server. These assets are then delivered to the desktop workstation over a network.

To power VDI setups, Supermicro offers a 4U GPU server with up to 8 PCIe GPUs. The Supermicro AS-4125GS-TNRT server packs a pair of AMD EPYC 9004 processors, Nvidia RTX 6000 GPUs, and 6TB of DDR5 memory.

* Rendering: The last stage of film production, rendering is where the individual 3D images created on a computer are transformed into the stream of 2D images ready to be shown to audiences. This process, conducted pixel by pixel, is time-consuming and resource-hungry. It requires powerful servers, lots of storage capacity and fast networking.

For rendering, Supermicro offers its 2U Hyper system, the AS-2125HS-TNR. It’s configured with dual AMD EPYC 9004 processors, up to 6TB of memory, and your choice of NVMe, SATA or SAS storage.

* Storage: Content creation involves creating, storing and manipulating huge volumes of data. So the first requirement is simply having a great deal of storage capacity. But it’s also important to be able to retrieve and access that data quickly.

For these kinds of storage challenges, Supermicro offers Petascale storage servers based on AMD EPYC processors. They can pack up to 16 hot-swappable E3.S (7.5mm) NVMe drive bays. And they’ve been designed to store, process and move vast amounts of data.

M&E content creators are always looking to attract more attention. They’re getting help from today’s most advanced technology.


How ILM creates visual effects faster & cheaper with AMD-powered Supermicro hardware


ILM, the visual-effects company founded by George Lucas, is using AMD-powered Supermicro servers and workstations to create the next generation of special effects for movies and TV.


AMD and Supermicro are helping Industrial Light & Magic (ILM) create the future of visual movie and TV production.

ILM is the visual-effects company founded by George Lucas in 1975. Today it’s still on the lookout for better, faster tech. And to get it, ILM leans on Supermicro for its rackmount servers and workstations, and AMD for its processors.

The servers help ILM reduce render times. And the workstations enable better collaboration and storage solutions that move data faster and more efficiently.

All that high-tech gear comes together to help ILM create some of the world’s most popular TV series and movies. That includes “Obi-Wan Kenobi,” “Transformers” and “The Book of Boba Fett.”

It’s a huge task. But hey, someone’s got to create all those new universes, right?

Power hungry—and proud of it

No one gobbles up compute power quite like ILM. Sure, it may have all started with George Lucas dropping an automotive spring on a concrete floor to create the sound of the first lightsaber. But these days, it’s all about the 1s and 0s—a lot of them.

An enormous amount of compute power goes into rendering computer-generated imagery (CGI) like special effects and alien characters. So much power, in fact, that it can take weeks or even months to render an entire movie’s worth of eye candy.

Rendering takes not only time, but also money and energy. Those are the three resources that production companies like ILM must ration. They’re under pressure to manage cash flow and keep to tight production schedules.

By deploying Supermicro’s high-performance and multinode servers powered by AMD’s EPYC processors, ILM gains high core counts and maximum throughput—two crucial components of faster rendering.

Modern filmmakers are also obliged to manage data. Storing and moving terabytes of rendering and composition information is a constant challenge, especially when you’re trying to do it quickly and securely.

The solution to this problem comes in the form of high-performance storage and networking devices. They can shift vast swaths of information from here to there without bottlenecks, overheating or (worst-case scenario) total failure.

EPYC stories

This is the part of the story where CPUs take back some of the spotlight. GPUs have been stealing the show ever since data scientists discovered that graphic processors are the keys to unlocking the power of AI. But producing the next chapter of the “Star Wars” franchise means playing by different rules.

AMD EPYC processors play a starring role in ILM’s render farms. Render farms are big collections of networked server-class computers that work as a team to crunch a metric ton of data.

A typical ILM render farm might contain dozens of high-performance computers like the Supermicro BigTwin. This dual-node processing behemoth can house two 3rd gen AMD EPYC processors, 4TB of DDR4 memory per node and a dozen 2.5-inch hot-swappable solid-state drives (SSDs). In case the specs don’t speak for themselves, that’s an insane amount of power and storage.

For ILM, lighting and rendering happen inside an application by Isotropix called Clarisse. Our hero, Clarisse, relies on CPU rather than GPU power. Unlike most 3D apps, which are single-threaded, Clarisse also features unusually efficient multi-threading.

This lets the application take advantage of the parallel-processing power in AMD’s EPYC CPUs to complete more tasks simultaneously. The results: shorter production times and lower costs.

Coming soon: StageCraft

ILM is taking its tech show on the road with an end-to-end virtual production solution called StageCraft. It exists as both a series of Los Angeles- and Vancouver-based sites—ILM calls them “volumes”—as well as mobile pop-up volumes waiting to happen anywhere in the United States and Europe.

The introduction of StageCraft is interesting for a couple of reasons. For one, this new production environment makes ILM’s AMD-powered magic wand accessible to a wider range of directors, producers and studios.

For another, StageCraft could catalyze the proliferation of cutting-edge creative tech. This, in turn, could lead to the same kind of competition, efficiency increases and miniaturization that made 4K filmmaking a feature of everyone’s mobile phones.

StageCraft could also usher in a new visual language. The more people with access to high-tech visualization technology, the more likely it is that some unknown aspiring auteur will pop up, seemingly out of nowhere, to change the nature of entertainment forever.

Kinda like how George Lucas did it back in the day.


Tech Explainer: How does Gaming as a Service work?


Gaming as a Service is a streaming platform that pushes content from the cloud to personal devices on demand. Though it’s been around for years, in some ways it’s just getting started.


The technology known as Gaming as a Service (GaaS) has been around for 20 years. But in many ways it’s just getting started.

The technology is already enjoyed by millions of gamers worldwide. But new advances in AI and edge computing are making a big difference. So are faster, more consistent internet connections.

And coming soon should be a mix of virtual and augmented reality (VR & AR) headsets. They could bring gaming to a whole new level.

But how does GaaS work? Let’s take a look.

Cloud + edge = GaaS

GaaS is to video games what Netflix is to movies. Like Netflix, GaaS is a streaming platform that pushes content from the cloud to PCs, smartphones and other personal devices (including gaming consoles with the appropriate updates) on demand.

GaaS originates in the cloud. There, data centers packed with powerful servers maintain the gaming environment, process user commands, determine interaction between players and the virtual world, and deliver real-time results to players.

If the cloud is GaaS’s brains, then edge computing networks are its arms. They reach out to a worldwide base of users, connecting their devices to the gaming cloud.

Edge devices also keep things speedy by amplifying or, if necessary, taking over various processing duties. This helps reduce latency, the time lag between when a command is issued and when it’s executed.

Latency is especially detrimental to gamers. They rely on split-second actions that can make the difference between winning and losing. For them, lower latency is always better.
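
To see why every millisecond matters, consider a hypothetical end-to-end latency budget for one cloud-gaming frame. All of the figures below are illustrative assumptions, not measurements from any GaaS platform:

```python
# Hypothetical latency budget for one cloud-gaming interaction.
# Every figure here is an illustrative assumption, not a measured value.
budget_ms = {
    "input capture + uplink encode": 5,
    "network to edge node": 10,
    "server simulation + render": 16,    # roughly one 60 fps frame
    "video encode": 5,
    "network downlink": 10,
    "client decode + display": 10,
}

total = sum(budget_ms.values())
print(total)         # 56 ms end-to-end
print(total <= 70)   # True: inside an assumed ~70 ms playability target
```

Edge nodes help precisely because they shrink the two network legs, which are the entries a game developer controls least.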

Device choice

GaaS is innovative at the user end, too. GaaS can interface with a wide array of client devices. That offers gamers far more flexibility than they get with traditional gaming models.

With GaaS, users are no longer tied to a specific gaming PC or console such as the Microsoft Xbox or Sony PlayStation. Instead, gamers can use any supported device with a decent GPU and a stable internet connection speed of at least 10 to 15 Mbps.

To be sure, some GaaS games—one example is the super-popular Fortnite—require a mobile or desktop app. But these apps are usually free.

Other cloud-based games are designed to work with any standard web browser. This lets a gamer pick up wherever they left off, using nearly any internet-connected device anywhere in the world.

Big business

If all this sounds attractive, it is. One of the first GaaS titles, World of Warcraft, is still active nearly 20 years after its initial launch. In 2015—the last time its publisher, Blizzard Entertainment, reported usage numbers—World of Warcraft had 5.5 million players.

Even more popular is Fortnite, introduced in 2017. Today it has more than 350 million registered users. In part, that’s because of the game’s flexible business model: Fortnite players can sign up and enjoy basic gameplay for free.

Instead of charging these users a fee, Fortnite’s developer, Epic Games, makes money from literally millions of micro-transactions. These include in-game purchases of weapons and accessories, access to tournaments and other gated experiences, and the purchase of a new “season,” released four times a year.

Super-popular games like Fortnite and World of Warcraft have helped create a lucrative and compelling business model. This, in turn, has given rise to a new breed of GaaS tech providers.

One such operation is Blacknut, a France-based cloud gaming platform. Together with Australian outfit Radian Arc, Blacknut provides a GaaS digital infrastructure powered by AMD-based GPU servers designed and distributed by Supermicro.

What could go wrong?

Does GaaS have a downside? Sure. No platform is without its flaws.

For one, cloud gamers are at the mercy of the cloud. If a cloud provider experiences a slowdown or outage, a game can disappear until the issue is resolved.

For another, unlike a collection of game titles on physical media, GaaS gamers never really own the games they play. For example, if Epic decided to shut down Fortnite tomorrow, 350+ million gamers would have no choice but to look for alternate entertainment.

Internet access can be an issue, too. Those of us in first-world cities tend to take our high-speed connections for granted. The rest of the world may not be so lucky.

Future of GaaS

Looking ahead, the future of GaaS appears bright.

Advances in AI-powered cloud and edge computing will encourage game developers to create more nuanced and immersive content than ever before.

Faster and more consistent internet connections will help. They’ll give more power to both the bandwidth-hungry devices we use today and the shiny, new objects of desire we’ll clamor for tomorrow.

Tomorrow’s devices will surely include a mixture of VR and AR headsets. These could attach to other smart devices that enhance gameplay, like the interactive bodysuits foretold by movies such as Ready Player One.

GaaS will get smaller, too, as new mobile devices come to the market. Cloud-gaming titles, already a mainstay of mobile gamers, should be further empowered by next-generation mobile processors and faster, more reliable wireless data connections like 5G.

We’re witnessing the evolution of gaming as multiple clients interact with low latencies and high-quality graphics. Welcome to the future.


Gaming as a Service gets a platform boost


Gaming as a Service gets a boost from Blacknut’s new platform for content providers that’s powered by Supermicro and Radian Arc.


Getting into Gaming as a Service? Cloud gaming provider Blacknut has released a new platform for content providers that’s powered by Supermicro and Radian Arc.

This comprehensive edge and cloud architecture provides content providers worldwide with bundled and fully managed game licensing, in-depth content metadata and a global hybrid-cloud solution.

If you’re not into gaming yet, you might want to be. Interactive entertainment and game streaming are on the rise.

Last year, an estimated 30 million paying users spent a combined $2.4 billion on cloud gaming services, according to research firm Newzoo. Looking ahead, Newzoo expects this revenue to more than triple by 2025, topping $8 billion. That would make the GaaS market an attractive investment for content providers.

What’s more, studies show that Gen Z consumers (aged 11 to 26) spend over 12 hours a week playing video games. That’s about 30 minutes more per week than they spend watching TV.

Paradigm shift

This data could signal a paradigm shift that challenges the dominance of traditional digital entertainment. That could include subscription video on demand (SVOD) such as Netflix as well as content platforms including ISPs, device manufacturers and media companies.

To help content providers capture younger, more tech-savvy consumers, Blacknut, Supermicro and Radian Arc have teamed up to deploy a fully integrated GaaS platform. Blacknut, based in France, offers cloud-based gaming. Australia-based Radian Arc provides digital infrastructure and cloud game technology.

The system offers IT hardware solutions at the edge and the core, system management software and extensive IP, including Blacknut’s catalog of more than 600 games ranging from AAA titles to indies.

Blacknut is also providing white-glove services that include:

  • Onboarding of game wish lists and help establishing exclusive publisher agreements
  • Support for Bring Your Own Game (BYOG) and freemium game models
  • Assistance with the development of IP-licensed games designed in partnership with specialized studios
  • Marketing support to help providers develop go-to-market plans and manage subscriber engagement

The tech behind GaaS

Providers of cloud-based content know all too well the challenge of providing customers with high-availability, low-latency service. The right technology is a carefully choreographed ballet of hybrid cloud infrastructure, modern edge architecture and the IT expertise required to make it all run smoothly.

At the edge, Blacknut’s GaaS offering operates on Radian Arc’s GPU Edge Infrastructure-as-a-Service platform powered by Supermicro GPU Edge Infrastructure solutions.

These hardware solutions include flexible GPU servers featuring 6 to 8 directly attached GPUs and AMD EPYC processors. Also on board are cloud-optimized, scalable management servers and feature-rich ToR networking switches.

Combined with Blacknut’s public and private cloud infrastructure, an impressive array of hardware and software solutions come together. These can create new ways for content providers to quickly roll out their own cloud-gaming products and capture additional market share.

Going global

The Blacknut GaaS platform is already live in 45 countries and is expanding via distribution partnerships with over-the-top providers and carriers.

The solution can also be pre-embedded in set-top boxes and TV ecosystems. Indeed, it has already found its way onto such marquee devices as Samsung Gaming Hub, LG Gaming Shelf and Amazon FireTV.

To learn more about the Blacknut GaaS platform powered by Radian Arc and Supermicro, check out the companies’ solution brief.


Enter Your Animation in Pixar’s RenderMan NASA Space Images Art Challenge



One of the biggest uses of performance-intensive computing is the creation of high-resolution graphic animations used for entertainment and commercial applications. To that end, AMD and Pixar Animation Studios have announced the ninth RenderMan Art Challenge, which is open to the public. The idea is to encourage creative types to use some of the same tools that professional graphic designers and animators use to build something based on actual NASA data.


The winners will be determined by a panel of judges from Pixar, NASA and Industrial Light & Magic. Projects must be submitted by November 15, and the winning entries will be announced at the end of November.

This year’s challenge provides access to AMD-powered Azure virtual machines, letting contestants use the highest-performing compute instances. Contestants will be given entrance to the AMD Creator Cloud, a render farm powered by Azure HBv3 instances built on high-performance AMD EPYC™ processors using AMD 3D V-Cache™ technology.

For the first time, challengers can run their designs using thousands of AMD EPYC™ core CPUs, enabling artists to develop the most complex animations and the most amazing visualizations. “The contestants have access to this professional-grade render farm just like the pros. It levels the playing field,” said James Knight, the director of entertainment for AMD. “You can make scenes that weren’t possible before on your own PC,” he said.

The topic focus for this year’s challenge is space-related, in keeping with NASA’s involvement. The challenge provides scientifically accurate 3D NASA models, including telescopes, space stations, suits and planets. One potential advantage: many past contestants have ended up working at Pixar. “The RenderMan challenge gives everyone a chance to learn new things and show their abilities and creativity. The whole experience was great,” said Khachik Astvatsatryan, a previous RenderMan Challenge winner.

Dylan Sisson, a RenderMan digital artist at Pixar, said “With the advancements we are seeing in hardware and software, individual artists are now able to create images of ever-increasing sophistication and complexity. It is a great opportunity for challengers to unleash their creative vision with these state-of-the-art technologies."


Register to Watch Supermicro's Sweeping A+ Launch Event on Nov. 10


Join Supermicro online Nov. 10 to watch the unveiling of the company’s new A+ systems, featuring next-generation AMD EPYC™ processors. They can’t tell us any more right now. But you can register for a link to the event by scrolling down and signing up on this page.


AMD’s Threadripper: Higher-Performance Computing from a Desktop Processor



Content creators, designers, video animators and digital FX experts make much higher demands of their digital workstations than typical PC users. These disciplines often use heavily threaded applications such as Adobe After Effects, Unreal Engine and CAD apps from Autodesk. What’s needed is a corresponding increase in computing power to handle these applications.


That’s where one solution comes in handy for this type of power user: the AMD Ryzen Threadripper™ CPU, which now has a PRO 5000 update. One advantage of the newer chips is that they fit the same WRX80 motherboards that supported the earlier Threadripper series. Velocity Micro offers them in its ProMagix HD150 workstation, and the solution provider is testing overclocking on both the MSI and ASRock motherboards it will include in the HD150. That’s right: a chip designed from the get-go to be overclocked. In benchmarks using the heavily threaded apps mentioned above, the new chips ran about twice as fast as competitors’ less-capable hardware. (Supermicro offers the M12SWA-TF motherboard for the Threadripper chipset.)

Desktop Was Never Like This

The AMD Threadripper™ CPU may be a desktop processor, but desktop computing was never like this. The new chips come in a variety of multi-core versions, with a maximum of 64 cores running up to 128 threads, 256MB of L3 cache and 2TB of 8-channel DDR4 memory. The newest Threadrippers are built with AMD’s latest 7-nanometer dies.


The Threadripper CPUs are not just fast; they’re built on AMD’s Zen 3 architecture and include security features such as Shadow Stack. Zen 3 is the overall name for a series of improvements to AMD’s higher-end CPU line that have shown a 19% improvement in instructions per clock, along with lower latency and double the cache delivery compared with the earlier Zen 2 architecture.


These processors also support Microsoft’s Hardware-enforced Stack Protection to help detect and thwart control-flow attacks by checking the normal program stack against a secured hardware-stored copy. This helps to boot securely, protect the computer from firmware vulnerabilities, shield the operating system from attacks, and prevent unauthorized access to devices and data with advanced access controls and authentication systems.
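
The idea behind a shadow stack can be illustrated with a conceptual sketch. This is not AMD's hardware implementation, just the core check it performs: every return address is saved to a second, protected stack, and a mismatch on return signals a control-flow attack:

```python
# Conceptual sketch of a shadow stack (not AMD's hardware implementation).
# A protected copy of each return address is kept; a mismatch on return
# means the on-stack address was overwritten, i.e. a control-flow attack.
shadow = []

def call(return_addr: int) -> None:
    shadow.append(return_addr)        # hardware saves a second, protected copy

def ret(return_addr_on_stack: int) -> bool:
    expected = shadow.pop()           # hardware pops its protected copy
    return return_addr_on_stack == expected

call(0x401000)
print(ret(0x401000))                  # True: normal return
call(0x401200)
print(ret(0xDEADBEEF))                # False: overwritten return address caught
```

In real silicon the shadow copy lives in memory the program cannot write, which is what makes the comparison trustworthy.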


Innovations from Supermicro and AMD Help Create Visual Effects for Blur Studio


Blur Studio calculated it could replace a competitor's 500-node server farm with just 56 Supermicro A+ servers equipped with AMD EPYC™ CPUs, getting equivalent processing power.

Learn More about this topic
  • Applications:
  • Featured Technologies:

The latest computer graphics images in movies and TV require the latest computing innovations. The scenes are getting more realistic, and that means taking advantage of Supermicro A+ computers using AMD EPYC™ 7742 CPUs with 64 cores, 128 threads and loads of DDR4 memory. “These have the necessary horsepower to render the visual effects,” said Shawn Wallbridge, the head of IT for Blur Studio.

Blur is a major animation and visual effects house begun by Tim Miller, the director of “Deadpool.” The studio has produced game cinematics, commercials and complex visuals such as scenes for the latest Halo Wars, League of Legends and “Terminator: Dark Fate.”

Animation can benefit from AMD’s advanced CPUs with higher core densities and clock speeds, supporting higher frame rates and scene interactions.


Blur originally used a 500-node server farm with a competitor’s CPUs. It switched to the AMD EPYC™ processors when it had to work on three very demanding films concurrently. Rendering times that previously would have taken 75 hours to complete took only 10 hours with the AMD EPYC™ CPU-powered computers. There were also significant workflow improvements because the graphic artists could see the results overnight rather than having to wait days. Blur was able to create more complex action scenes that were both frenetic and highly believable to audiences.


The studio calculated it could provide the equivalent processing power by replacing its 500-node server farm with just 56 Supermicro A+ servers equipped with AMD EPYC™ CPUs. Additional advantages included lower software licensing fees, reduced power consumption and lower cooling expenses.
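
Blur's consolidation and speedup figures reduce to two simple ratios, computed here from the numbers quoted above:

```python
# Blur Studio's consolidation and speedup figures, checked as simple ratios.
old_nodes, new_servers = 500, 56          # legacy farm vs. Supermicro A+ servers
old_render_hours, new_render_hours = 75, 10

consolidation = old_nodes / new_servers         # ~8.9 legacy nodes per new server
speedup = old_render_hours / new_render_hours   # 7.5x faster renders

print(round(consolidation, 1))   # 8.9
print(speedup)                   # 7.5
```

Nearly nine old nodes replaced per new server also explains the knock-on savings in licensing, power and cooling.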


“Considering the CPU marketplace right now, there is just no competition. It’s just mind blowing how fast the effects are,” said Blur's Wallbridge.


For more information about this story, see the AMD case study on Blur Studio.

