Where Are Blockchain and Web3 Taking Us? — Part 2: Delving Deeper into Blockchain

This is the second in a four-part series on blockchain’s many facets, including being the primary pillar of the emerging Web3.


Part 1: First There Was Blockchain  |  Part 3: Web3 Emerging  |  Part 4: The Web3 and Blockchain FAQ

To get a sound understanding of blockchain, you should be aware of some of the nagging issues and criticisms. For example, blockchain has no governance. It could use the guidance of a small, representative group of industry visionaries to help it chart a course, but that might lead to a more centralized orientation. You should also familiarize yourself with the related tools and technologies and what they do. NFTs, in particular, work hand in hand with blockchain and add protection for content creators.

 

Getting NFTs

 

It has been effectively open season on digital content on the internet from the get-go. DRM technology didn’t solve the problem. Will the non-fungible token (NFT) make inroads? Its long-term success will largely depend on the success of blockchain. Make no mistake, blockchain is here to stay. It’s too useful a tool to leave behind. But Web3’s premise — that blockchain-based servers might someday run the internet — is by no means certain. (Come back for Part 3, which explores Web3.)

 

What are NFTs? “NFTs facilitate non-fraudulent trade for digital asset producers and consumers or collectors,” said Eric Frazier, senior solutions manager, Supermicro.

 

An NFT is a digital asset authentication system located on a blockchain that gives the holder proof of ownership of a digital creation. It does this via metadata that makes each NFT unique. Plus, no two people can own the same NFT, and an NFT can’t be changed or destroyed.

 

Applications include digital artwork, but an NFT (sometimes called a "nifty") has a wide variety of uses in music, gaming, entertainment, popular culture items (such as sports merchandise), virtual real estate, prevention of counterfeit products, domain name provenance and more. Down the road, NFTs may have a significant effect on software licensing, intellectual property rights and copyright. Land registries, birth and death certificates, and many other types of records are also potential future beneficiaries of NFTs.

 

If you’re wondering whether NFTs can be traded for cryptocurrency, they can be. What they are not is interchangeable. You may have an NFT for a piece of art that was sold as multiple copies by its owner. But each of those NFTs has unique metadata, so one may not be exchanged for another.
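To make the uniqueness point concrete, here is a minimal sketch in Python (the fields and names are hypothetical; real NFTs are implemented as on-chain smart contracts, such as Ethereum’s ERC-721 tokens) showing why two tokens minted for the same artwork still can’t be swapped one for the other:

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class NFT:
    token_id: int   # unique per token, even for copies of one artwork
    asset_uri: str  # points to the digital creation itself
    owner: str

    def fingerprint(self) -> str:
        # Hash of the token's metadata: same artwork, different fingerprint.
        data = f"{self.token_id}|{self.asset_uri}|{self.owner}"
        return hashlib.sha256(data.encode()).hexdigest()

# Two copies of the same piece, sold to two different buyers:
copy_1 = NFT(token_id=1, asset_uri="ipfs://artwork", owner="alice")
copy_2 = NFT(token_id=2, asset_uri="ipfs://artwork", owner="bob")

# Distinct metadata means distinct tokens -- hence "non-fungible."
assert copy_1.fingerprint() != copy_2.fingerprint()
```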

 

Smart Contracts Execute

 

A smart contract is a blockchain-based, self-executing contract containing code that runs automatically when predetermined conditions, as set out in an agreement or transaction, are met. A hypothetical example: on January 15, transfer X value of cryptocurrency in payment for a specific NFT owned by a specific person. Smart contracts are autonomous, trustless, traceable, transparent and irreversible. A key hallmark of smart contracts is that they exclude intermediaries and third parties like lawyers and notaries. They also usually use simple language, require fewer steps and involve less paperwork.
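To illustrate the idea of code that runs automatically when conditions are met, here is a minimal sketch in plain Python (real smart contracts are typically written in on-chain languages such as Solidity; the balances, date and token name below are hypothetical):

```python
from datetime import date

# Hypothetical ledger state; on a real blockchain this state lives on-chain.
balances = {"buyer": 100, "seller": 0}
nft_owner = {"token_42": "seller"}

def smart_contract(today: date) -> None:
    """Self-executing agreement: on or after Jan. 15, pay 25 units for token_42."""
    if today >= date(2023, 1, 15) and balances["buyer"] >= 25:
        balances["buyer"] -= 25          # payment executes automatically...
        balances["seller"] += 25
        nft_owner["token_42"] = "buyer"  # ...and ownership transfers, no intermediary

smart_contract(date(2023, 1, 15))
assert nft_owner["token_42"] == "buyer"
```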

 

Blockchain Power Consumption

 

Some blockchains gobble up electricity and are heavy users of compute and storage resources. But blockchains are not all created equal. Bitcoin is known to be resource hungry, while “Filecoin’s needs are materially less,” said Michael Fair, chief revenue officer and longtime channel expert, PiKNiK.

 

It’s also possible to change some blockchains to make them less power hungry. For example, Ethereum switched from the Proof-of-Work (PoW) to the Proof-of-Stake (PoS) algorithm a few months ago, which reduced its power consumption by over 99%. However, Ethereum is less decentralized as a result, because it is now 80% hosted on AWS. (See the discussion on Understanding “Decentralized” in Part 1.)
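The power gap comes down to how each algorithm chooses who writes the next block. A rough sketch in Python (illustrative only; real networks layer much more on top of this):

```python
import hashlib
import random

def proof_of_work(block_data: str, difficulty: int = 5) -> int:
    """Brute-force a nonce until the block hash starts with `difficulty` zeros.
    Every guess burns energy, and the whole network races to guess first."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def proof_of_stake(stakes: dict) -> str:
    """Pick a validator weighted by staked coins: one cheap random draw,
    no hashing race -- which is where the ~99% power savings come from."""
    return random.choices(list(stakes), weights=list(stakes.values()))[0]

print(proof_of_work("block 1: alice pays bob"))             # ~a million hashes
print(proof_of_stake({"v1": 32.0, "v2": 64.0, "v3": 4.0}))  # a single draw
```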

 

“With the algorithm switch from PoW to PoS, Ethereum’s decentralization took a big hit because the majority of transactions and validations are running on Amazon’s cloud,” said Jörg Roskowetz, director of blockchain technology, AMD. “From my point of view, hybrid systems like Lightning on the Bitcoin network will keep all the parameters improving — scalability, latency and power-consumption challenges. This will likely take years to be developed and improved.”

 

Can Web3 Remain Decentralized?

 

Is the blockchain movement viable going forward? There are skeptics: for example, Scott Nover, writing in Quartz, and Moxie Marlinspike. Both pieces were published almost a year ago, in January 2022, well before the change at Ethereum.

 

Nover writes: “Even if blockchains are decentralized, the Web3 services that interact with them are controlled by a very small number of privately held companies. In fact, the industry emerging to support the decentralized web is highly consolidated, potentially undermining the promise of Web3.”

 

These are real concerns. But no one expected Web3 to exist in a world free of potentially undermining factors, including the consolidation of Web3 blockchain companies as well as some interaction with Web 2.0 companies. If Web3 succeeds, it will need to support a good user experience and be resilient enough to develop additional ways of shielding itself from centralizing influences. It’s not going to exist in a vacuum.

 

 

Other Stories in this Series:

Part 1: First There Was Blockchain

Part 2: Delving Deeper into Blockchain

Part 3: Web3 Emerging

Part 4: The Web3 and Blockchain FAQ

 


Where Are Blockchain and Web3 Taking Us? — Part 1: First There Was Blockchain

This is the first story in a four-part series on blockchain’s many facets, including being the primary pillar of the emerging Web3. 


Part 2: Delving Deeper into Blockchain  |  Part 3: Web3 Emerging  |  Part 4: The Web3 and Blockchain FAQ

There has been a lot of buzz about blockchain over the past five years, and yet seemingly not much movement. Long ago I concluded that the amount of truth in a new technology’s reported value was inversely proportional to the din of its hype. But as with so much else about blockchain, it defies conventional wisdom. Blockchain is a bigger deal than is generally realized.

 

Basic Blockchain Definition and Introduction

 

(Source: Wikipedia): Blockchain is a peer-to-peer (P2P), publicly decentralized ledger (a shared, distributed database) that consists of blocks of data bound together with cryptography. Each block contains a cryptographic hash of the previous block, a timestamp and transaction data. Because each block contains information from the previous block, they effectively form a chain – hence the name blockchain.

 

Blockchain transactions resist alteration once they are recorded, because the data in any given block cannot be changed retroactively without altering all subsequent blocks that duplicate that data. As a P2P, publicly distributed ledger, nodes collectively adhere to a consensus protocol to add and validate new transaction blocks.
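Here is a minimal sketch of that chaining idea in Python (a toy model, not any production implementation): each block stores the hash of its predecessor, so a retroactive edit breaks every later link.

```python
import hashlib
import json

def block_hash(data: str, prev_hash: str) -> str:
    return hashlib.sha256(json.dumps([data, prev_hash]).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash, "hash": block_hash(data, prev_hash)}

# Each block commits to its predecessor's hash -- that is the "chain."
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("tx: alice -> bob, 5", chain[-1]["hash"]))
chain.append(make_block("tx: bob -> carol, 2", chain[-1]["hash"]))

def is_valid(chain: list) -> bool:
    return all(
        block["hash"] == block_hash(block["data"], block["prev_hash"])
        and block["prev_hash"] == chain[i - 1]["hash"]
        for i, block in enumerate(chain) if i > 0
    )

chain[1]["data"] = "tx: alice -> bob, 5000"  # attempted retroactive edit...
print(is_valid(chain))  # False: block 1's hash no longer matches its contents
```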

 

“A blockchain is a system of recording information in a way that makes it difficult or impossible to change, cheat or hack the system,” said Eric Frazier, senior solutions manager, Supermicro. “It is a digital ledger that is duplicated and distributed to a network of multiple nodes on the blockchain.”

 

Michael Fair, PiKNiK’s chief revenue officer and longtime channel expert added, “In the blockchain, data is immutable. It’s actually sealed within the network, which is monitored by the blockchain 24 x 7 x 365 days a year.”

 

Blockchain was created in 2008 by a person or group writing under the pseudonym Satoshi Nakamoto. Its original use was to provide a public distributed ledger for the bitcoin cryptocurrency, also created by the same entity. But the true promise of blockchain goes way beyond cryptocurrency. The downside is that blockchain operations are computationally intensive and tend to use lots of power. This issue is covered in more detail later in the series.

 

Understanding “Decentralized”

 

Decentralization is probably the most important tenet of Web3, and it is at least partially delivered by blockchain. The word has a specific set of meanings, although it has become something of a buzzword, which tends to blur those meanings.

 

Gavin Wood is an Ethereum cofounder, the founder of Polkadot and the person who coined the term Web3 in 2014. Based on comments Wood made in a January 2022 YouTube video from CNBC International, as well as other sources, decentralized means that no one company’s servers exclusively own a crucial part of the internet. There are two related meanings of decentralized that sometimes get confused:

 

1. In its most basic form, decentralized is about keeping data safe from monopolization by using blockchain and other technologies to make data and content independent. Data in a blockchain is copied to servers all over the world, none of which can change that information unilaterally. The data doesn’t live in any one place, and that is what protects it. Blockchain makes it immutable.
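A toy illustration of that protection in Python (hypothetical node names; real networks use consensus protocols rather than this simple majority vote): because every node holds a copy, one server’s unilateral edit is simply outvoted.

```python
import hashlib
from collections import Counter

def digest(ledger: list) -> str:
    return hashlib.sha256("\n".join(ledger).encode()).hexdigest()

# The same ledger replicated on servers all over the world (toy copies):
nodes = {f"node{i}": ["tx1", "tx2", "tx3"] for i in range(5)}
nodes["node3"] = ["tx1", "tx2-forged", "tx3"]  # one node edits unilaterally

# No single copy is authoritative; the network keeps the majority version.
majority_digest, votes = Counter(digest(copy) for copy in nodes.values()).most_common(1)[0]
print(f"{votes} of {len(nodes)} nodes agree; the forged copy is rejected")
```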

 

2. Decentralized also means what Wood called “political decentralization,” wherein “no one will have the power to turn off content,” the way top execs could (in theory) at companies like Google, Facebook, Amazon, Microsoft and Twitter. Decentralization could potentially kick these and other companies out of the “Your Data” business. A key phrase that relates to this meaning of the term is highly consolidated. How many companies have Google, Amazon, Microsoft and Facebook purchased over the past couple of decades? Google purchased YouTube. Facebook bought Instagram. Microsoft nabbed LinkedIn. But that’s just the tip of the iceberg. Where once there were many companies, now there are a few very large companies exerting control over the internet. That’s what highly consolidated refers to. It’s a term often used to describe the opposite of decentralized.

 

Blockchain Uses

 

Since 2019 or so, new ideas for blockchain applications have arrived fast and furiously. And while many remain plausible theories, others have been put into production. If your company’s sector of the marketplace happens to be one of the areas blockchain has been identified with, chances are good that blockchain is at least on your company’s radar.

 

Many organizations are looking to blockchain to rejuvenate their product pipelines. The future of blockchain will very likely be determined by technocrats and developers who harness it to chase profits. In other words, thousands of enterprises are developing blockchain products and services tailored to their own needs, and if they succeed, many others will likely follow.

 

Beyond supporting cryptocurrency, three early uses of blockchain have been:

  • Financial services
  • Government use of blockchain for voting
  • Helping to keep track of supply chains. There’s a synergy in the way they work that makes blockchain and supply chain ideal for one another.

Blockchain has quickly spread to several areas of financial services like tokenizing assets and fiat currencies, P2P lending backed by assets, decentralized finance (DeFi) and self-enforcing smart contracts to name a few.

 

Blockchain voting could help put a stop to the corruption surrounding elections. Countries like Sierra Leone and Russia were early adopters, and several other countries have tried it – including the U.S.

 

In healthcare, a handful of companies are attempting to revolutionize e-records by developing them on blockchain-based decentralized ledgers instead of storing them away in some company’s database. The medical community is also looking at blockchain as a way to store DNA information.

 

Storage systems are an early and important blockchain application. Companies like PiKNiK offer decentralized blockchain storage on a B2B basis.

 

Other Stories in this Series:

Part 1: First There Was Blockchain

Part 2: Delving Deeper into Blockchain

Part 3: Web3 Emerging

Part 4: The Web3 and Blockchain FAQ

 


Some Key Drivers behind AMD’s Plans for Future EPYC™ CPUs

A video discussion between Charles Liang, Supermicro CEO, and Dr. Lisa Su, AMD CEO.

 


Higher clock rates, more cores and larger onboard memory caches are some of the traditional areas of improvement for generational CPU upgrades. Performance improvements are almost a given with a new-generation CPU. Increasingly, however, the more difficult challenges for data centers and performance-intensive computing are energy efficiency and managing heat. Energy costs have spiked in many parts of the world, and “performance per watt” is what many companies are looking for. AMD’s 4th-gen EPYC™ CPU runs a little hotter than its predecessor, but its performance gains far outpace the thermal rise, making for much greater performance per watt. It’s a trade-off that makes sense, especially for performance-intensive computing such as HPC and technical computing applications.

In addition to energy efficiency and heat dissipation, Dr. Su and Mr. Liang discuss the importance of the AMD EPYC™ roadmap. You’ll learn one or two nuances about AMD’s plans. Supermicro is ready with 15 products that leverage Genoa, AMD’s fourth-generation EPYC™ CPU. This under-15-minute video, recorded on November 15, 2022, will bring you up to date on all things AMD EPYC™. Click the link to see the video:

Supermicro & AMD CEOs Video – The Future of Data Center Computing

 

 

 

 


Supermicro H13 Servers Maximize Your High-Performance Data Center


The modern data center must be both highly performant and energy efficient. Massive amounts of data are generated at the edge and then analyzed in the data center. New CPU technologies are constantly being developed that can analyze data, determine the best course of action, and shorten the time it takes to understand the world around us and make better decisions.

With digital transformation continuing, a wide range of data acquisition, storage and computing systems continue to evolve with each CPU generation. The latest CPU generations continue to innovate within their core computational units and in the technology used to communicate with memory, storage devices, networking and accelerators.

Servers, and by extension the CPUs within them, form a continuum of computing and I/O power. The combination of cores, clock rates, memory access, path width and performance makes specific servers suited to specific workloads. In addition, the server that houses the CPUs may take different form factors for environments with airflow or power restrictions. The key to a server manufacturer addressing a wide range of applications is a building-block approach to designing new systems. In this way, a range of systems can be released simultaneously in many form factors, each tailored to its operating environment.

The new H13 Supermicro product line, based on 4th Generation AMD EPYC™ CPUs, supports a broad spectrum of workloads and excels at helping a business achieve its goals.

Get speeds, feeds and other specs on Supermicro’s latest line-up of servers


AMD Announces Fourth-Generation EPYC™ CPUs with the 9004 Series Processors

AMD announces its fourth-generation EPYC™ CPUs. The new EPYC 9004 Series processors demonstrate advances in hybrid, multi-die architecture by decoupling core and I/O processes. Part 1 of 4.

AMD very recently announced its fourth-generation EPYC™ CPUs. This generation provides innovative solutions that can satisfy the most demanding performance-intensive computing requirements for cloud computing, AI and highly parallelized data analytics applications. The design decisions AMD made on this processor generation strike a good balance among specifications, including higher CPU power and I/O performance, latency reductions and improvements in overall data throughput. This lets a single CPU socket address an increasingly larger world of complex workloads.
 
The new AMD EPYC™ 9004 Series processors demonstrate advances in hybrid, multi-die architecture by decoupling core and I/O processes. The new chip dies support 12 DDR5 memory channels, doubling the I/O throughput of previous generations. The new CPUs also increase core counts from 64 cores in the previous EPYC 7003 chips to 96 cores in the new chips using 5-nanometer processes. The new generation of chips also increases the maximum memory capacity from 4TB of DDR4-3200 to 6TB of DDR5-4800 memory.
 
 
 
There are three major innovations evident in the AMD EPYC™ 9004 processor series:
  1. A new hybrid, multi-die chip architecture coupled with multi-processor server innovations and a new, more advanced Zen 4 instruction set, along with support for increased dedicated L2 and shared L3 cache
  2. Security enhancements to AMD’s Infinity Guard
  3. Advances to system-on-chip designs that extend and enhance AMD Infinity switching fabric technology
Taken together, the new AMD EPYC™ 9004 series processors can offer plenty of innovation and performance advantage. The new processors offer better performance per watt of power consumed and better per core performance, too.
 


Unlocking the Value of the Cloud for Mid-size Enterprises


Organizations around the world are requiring new options for their next-generation computing environments. Mid-size organizations, in particular, face increasing pressure to deliver cost-effective, high-performance solutions within their hyperconverged infrastructures (HCI). A recent collaboration between Supermicro, Microsoft Azure and AMD, leveraging their collective technologies, has created a fresh approach that lets enterprises maintain performance at lower operational cost while helping to reduce the organization’s carbon footprint in support of sustainability initiatives. This cost-effective 1U system (a 2U version is available) offers power, flexibility and modularity in large-scale GPU deployments.

The results of the collaboration combine the latest technologies, supporting multiple CPU, GPU, storage and networking options optimized to deliver uniquely configured and highly scalable systems. The product can be optimized for SQL and Oracle databases, VDI, productivity applications and database analytics. This white paper explores why this universal GPU architecture is an intriguing and cost-effective option for CTOs and IT administrators who are planning to rapidly implement hybrid cloud, data center modernization, branch office/edge networking or Kubernetes deployments at scale.

Get the 7-page white paper that provides the detail to assess the solution for yourself, including the new Azure Stack HCI certified system, specifications, cost justification and more.

 


Enter Your Animation in Pixar’s RenderMan NASA Space Images Art Challenge



One of the biggest uses of performance-intensive computing is the creation of high-resolution graphic animations used for entertainment and commercial applications. To that end, AMD and Pixar Animation Studios have announced the ninth RenderMan Art Challenge, which is open to the public. The idea is to encourage creative types to use some of the same tools that professional graphic designers and animators use to build something based on actual NASA data.

 

The winners will be determined by a panel of judges from Pixar, NASA and Industrial Light & Magic. Projects must be submitted by November 15, and the winning entries will be announced at the end of November.

 

This year’s challenge provides access to AMD-powered Azure virtual machines, letting contestants use the highest-performing compute instances. Contestants will be given access to the AMD Creator Cloud, a render farm powered by Azure HBv3 instances composed of high-performance AMD EPYC™ processors using AMD 3D V-Cache™ technology.

 

For the first time, challengers can run their designs using thousands of AMD EPYC™ core CPUs, enabling artists to develop the most complex animations and the most amazing visualizations. “The contestants have access to this professional-grade render farm just like the pros. It levels the playing field,” said James Knight, the director of entertainment for AMD. “You can make scenes that weren’t possible before on your own PC,” he said.

 

The topic focus for this year’s challenge is space-related, in keeping with NASA’s involvement. The challenge provides scientifically accurate 3D NASA models, including telescopes, space stations, suits and planets. One of the potential advantages: many winners of past contests have ended up working at Pixar. “The RenderMan challenge gives everyone a chance to learn new things and show their abilities and creativity. The whole experience was great," said Khachik Astvatsatryan, a previous RenderMan Challenge winner.

 

Dylan Sisson, a RenderMan digital artist at Pixar, said “With the advancements we are seeing in hardware and software, individual artists are now able to create images of ever-increasing sophistication and complexity. It is a great opportunity for challengers to unleash their creative vision with these state-of-the-art technologies."


Register to Watch Supermicro's Sweeping A+ Launch Event on Nov. 10

Join Supermicro online Nov. 10 to watch the unveiling of the company’s new A+ systems, featuring next-generation AMD EPYC™ processors. They can’t tell us any more right now. But you can register for a link to the event by scrolling down and signing up on this page.


Energy-Efficient AMD EPYC™ Processors Bring Significant Savings

Cut electricity consumption by up to half with AMD’s power-saving EPYC™ processors.


Nokia was able to target up to a 40% reduction in server power consumption using EPYC. DBS and Ateme each experienced a 50% drop in energy costs. AMD’s EPYC™ processors can provide big energy-saving benefits, so you can meet your most demanding application performance requirements and still deliver environmental benefits.

For example: to provide a collection of 1,200 virtual machines, AMD would require 10 servers, compared to 15 for those built using equivalent Intel CPUs. This translates into a 41% lower total cost of ownership over a three-year period, with a third less energy consumption, saving on carbon emissions too. For deeper detail, links to case studies from the companies mentioned above, and to find out how they saved significantly on energy costs while reducing their carbon footprints, check out the infographic.
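For the curious, a quick sketch of the arithmetic behind those figures (the 10-vs-15 server counts come from the comparison above; the per-server wattage is a hypothetical placeholder, not an AMD or Intel specification):

```python
VMS_NEEDED = 1200
amd_servers, intel_servers = 10, 15  # servers needed for 1,200 VMs, per the comparison

print(VMS_NEEDED / amd_servers, "VMs per AMD server")      # 120.0
print(VMS_NEEDED / intel_servers, "VMs per Intel server")  # 80.0

WATTS_PER_SERVER = 500  # hypothetical, assuming similar per-server draw
saving = 1 - (amd_servers * WATTS_PER_SERVER) / (intel_servers * WATTS_PER_SERVER)
print(f"Energy reduction from fewer servers: {saving:.0%}")  # ~33% -> "a third less"
```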

 


The Perfect Combination: The Weka Next-Gen File System, Supermicro A+ Servers and AMD EPYC™ CPUs



One of the challenges of building machine learning (ML) models is managing data. Your infrastructure must be able to process very large data sets rapidly as well as ingest both structured and unstructured data from a wide variety of sources.

 

That kind of data is typically generated in performance-intensive computing areas like GPU-accelerated applications, structural biology and digital simulations. Such applications typically have three problems: how to efficiently fill a data pipeline, how to easily integrate data across systems and how to manage rapid changes in data storage requirements. That’s where Weka.io comes into play, providing higher-speed data ingestion and avoiding unnecessary copies of your data while making it available across the entire ML modeling space.

 

Weka’s file system, WekaFS, has been developed just for this purpose. It unifies your entire data lake into a shared global namespace where you can more easily access and manage trillions of files stored in multiple locations from one directory. It works across both on-premises and cloud storage repositories and is optimized for cloud-intensive storage so that it will provide the lowest possible network latencies and highest performance.

 

This next-generation data storage file system has several other advantages: it is easy to deploy and entirely software-based, providing all-flash-level performance, NAS simplicity and manageability, cloud scalability and breakthrough economics. It was designed to run on any standard x86-based server hardware and commodity SSDs, or to run natively in the public cloud, such as on AWS.

 

Weka’s file system is designed to scale to hundreds of petabytes, thousands of compute instances and billions of files. Read and write latency for file operations against active data is as low as 200 microseconds in some instances.

 

Supermicro has produced its own NVMe Reference Architecture that supports WekaFS on some of its servers, including the Supermicro A+ AS-1114S-WN10RT and AS-2114S-WN24RT using AMD EPYC™ 7402P processors with at least 2TB of memory, expandable to 4TB. Both servers support hot-swappable NVMe storage modules for ultimate performance. Also check out the Supermicro WekaFS AI and HPC Solution Bundle.

 

 
