Monday, July 8th 2024

AMD is Becoming a Software Company. Here's the Plan

Just a few weeks ago, AMD invited us to Barcelona for a roundtable, to share its vision for the future of the company and to get our feedback. On site were prominent members of AMD leadership, including Phil Guido, Executive Vice President & Chief Commercial Officer, and Jack Huynh, Senior VP & GM, Computing and Graphics Business Group. AMD is making big changes to how it approaches technology, shifting its focus from hardware development toward software, APIs, and AI experiences. Software is no longer just a complement to hardware; it's the core of modern technological ecosystems, and AMD is finally aligning its strategy accordingly.

The major difference between AMD and NVIDIA is that AMD is a hardware company that makes software on the side to support its hardware, while NVIDIA is a software company that designs hardware on the side to accelerate its software. This is about to change, as AMD pivots toward software. The company believes it now has the full stack of computing hardware—CPUs, AI accelerators, GPUs, FPGAs, data-processing units, and even server architecture. The only frontier left for AMD is software.
Fast Forward to Barcelona
We walked into the room in Barcelona expecting the usual fluff talk about how the AI PC is the next big thing, and how it's all hands on deck to capture market share—we've heard that before, from pretty much everyone at Computex Taiwan. We did get a substantial talk on how AMD's new Ryzen AI 300 series "Strix Point" processors are the tip of the spear for the company's AI PC push, and how AMD thinks it has a winning combination of hardware to see it through. What we didn't expect was a glimpse into how much the company is changing to stay competitive in this new world, which led to a stunning disclosure.

AMD has "tripled our software engineering, and are going all-in on the software." This means not only bringing in more people, but also allowing people to change roles: "we moved some of our best people in the organization to support" these teams. When this transformation is complete, the company will more closely resemble industry contemporaries such as Intel and NVIDIA. AMD commented that in the past it was "silicon first, then we thought about SDKs, toolchains and then the ISVs (software development companies)." They continued: "Our shift in strategy is to talk to ISVs first...to understand what the developers want enabled," which is a fundamental change to how new processors are created. I really like this quote: "the old AMD would just chase speeds and feeds. The new AMD is going to be AI software first, we know how to do silicon"—and I agree that this is the right path forward.

AMD of the Old: Why Hardware-first is Bad for a Hardware Company in IT
AMD's hardware-first approach to tech has met with limited market success. Despite having a CPU microarchitecture that at least matches Intel's, the company barely commands a quarter of the market (server and client processors combined); and despite its gaming GPUs being competitive, it holds barely a sixth of that market. This is not for lack of performance—AMD makes some very powerful CPUs and GPUs that keep competitors on their toes. The number-one problem with AMD's technology has been comparatively weak engagement with the software vendor ecosystem: the first-party APIs, developer tools, resources, developer networking, and optimization work needed to make the best use of the hardware's exclusive and unique capabilities.

For example, Radeon GPUs had tessellation capabilities at least two generations ahead of NVIDIA, yet developers only exploited them after Microsoft standardized tessellation in the DirectX 11 API; the same happened with Mantle and DirectX 12. In both cases, the X-factor NVIDIA enjoys is a software-first approach, the way it engages with developers, and, more importantly, its install base (over 75% of the discrete GPU market). There have been several such examples of AMD silicon packing exotic accelerators across its hardware stack that the software community never properly exploited. The reason is usually the same—AMD has been a hardware-first company.

Why is Tesla a hotter stock than General Motors? Because General Motors is an automobile company that happens to use some technology in its vehicles, whereas Tesla is a tech company that happens to know automobile engineering. Tesla vehicles are software-defined devices that can transport you around. Tesla's approach to transportation has been to understand what consumers want or might need from a technology standpoint, and then to build the hardware to achieve it. In the end, you know Tesla for its savvy cars, much in the same way you know NVIDIA for GPUs that "just work"—and like Tesla's, NVIDIA's revenues are overwhelmingly made up of hardware sales, despite the company being software-first, or experience-first. Another example is Apple, which has built a huge ecosystem of software and services designed to work extremely well together, but which also locks people into its "walled garden," enabling huge profits for the company in the process.

NVIDIA's Weakness
This is not to say that AMD has neglected software entirely—far from it. The company has played nice guy by keeping much of its software base open source, through initiatives such as GPUOpen and ROCm, which are great resources for software developers, and we definitely love the support for open source. It's just that AMD has not treated software as its main product, the thing that makes people buy its hardware and brings in the revenue. AMD is aware of this and wants "to create a unified architecture across our CPU and RDNA, which will let us simplify [the] software." This looks like an approach similar to Intel's oneAPI, which makes a lot of sense, but it will be a challenging project. NVIDIA's advantage here is that it has just one kind of accelerator—the GPU, which runs CUDA—a single API for all developers to learn, which lets them solve a huge range of computing challenges on hardware ranging from $200 to $30,000.

On the other hand, this is also a weakness of NVIDIA, and an advantage for AMD. AMD has a rich IP portfolio of compute solutions, ranging from classic CPUs and GPUs to XDNA NPUs and FPGAs (through the Xilinx acquisition); now it just needs to bring them together, exposing a unified computing interface that makes it easy to strategically shift workloads between these core types, to maximize performance, efficiency, cost, or some combination of the three. Such a capability would let the company sell customers a single-product combined accelerator system composed of components like a CPU, GPU, and specialized FPGA(s)—similar to how you buy an iPhone rather than a screen, processor, 5G modem, and battery to combine on your own.
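To illustrate the idea of strategically shifting workloads between core types, here is a minimal toy sketch of what such a unified dispatch layer might do. The device heuristics, thresholds, and names below are entirely hypothetical illustrations, not an actual AMD API:

```python
# Toy sketch of a unified heterogeneous dispatcher.
# All names and heuristics are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    parallelism: int   # degree of data parallelism
    branchy: bool      # heavy control flow favors the CPU
    streaming: bool    # fixed-function streaming favors an FPGA

def pick_device(w: Workload) -> str:
    """Route a workload to the accelerator that suits its shape,
    the way a unified runtime conceivably might."""
    if w.streaming:
        return "fpga"
    if w.branchy or w.parallelism < 64:
        return "cpu"
    return "gpu"

jobs = [
    Workload("packet-filter", parallelism=8,    branchy=False, streaming=True),
    Workload("game-logic",    parallelism=4,    branchy=True,  streaming=False),
    Workload("matmul",        parallelism=4096, branchy=False, streaming=False),
]

for job in jobs:
    print(f"{job.name} -> {pick_device(job)}")  # fpga, cpu, gpu respectively
```

A real runtime would of course weigh measured performance, data-movement cost, and power rather than static flags, but the principle is the same: one interface, many accelerators.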

Enabling Software Developers
If you've sat through NVIDIA's GTC sessions like we have, barely 5-10% of the showtime is spent talking about NVIDIA hardware (its latest AI GPUs or accelerators up and down the stack). Most of the talk is about first-party software solutions: problems to solve, solutions, software, APIs, developer tools, collaboration tools, bare-metal system software, and only then the hardware. AMD has started its journey toward exactly this.

AMD is now talking to major software companies, such as Microsoft, Adobe, and OpenAI, to learn what their plans are and what they need from a future hardware generation. AMD's roadmaps now show the company's plans several years into the future, so that partners can see what AMD is creating and software products can better utilize these new features.

Market Research
We got a detailed presentation from market research firm IDC, which AMD contracted to study the short- and medium-term future of AI PCs, and the notion that PCs with native AI acceleration will bring a disruptive change to computing. This has happened before: when bricks became iPhones, when networks became the Internet, and when text-based prompts gave way to GUI interfaces. To be honest, generative AI has taken on a life of its own and is playing a crucial role in mainstreaming this new tech, but the current implementation relies on cloud-based acceleration, which means huge power usage and expensive NVIDIA GPUs. Are people willing to buy a whole new device just to get some of this acceleration onto their devices for privacy and latency? That remains to be seen. Even with 50 TOPS, the NPUs of AMD "Strix" and Intel "Lunar Lake" won't exactly zip through image generation, but they should make text-based LLMs viable, along with certain audiovisual effects such as Teams webcam background replacement, noise suppression, and even live translation.
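To put that 50 TOPS figure in perspective, a rough back-of-envelope calculation helps. The numbers below (sustained utilization, model size, memory bandwidth, the rule of thumb of roughly two operations per parameter per generated token) are our own illustrative assumptions, not vendor figures:

```python
# Back-of-envelope NPU LLM estimate; all inputs are assumptions.

npu_peak_tops = 50          # advertised peak for a "Strix Point"-class NPU
sustained_fraction = 0.3    # assumed realistic sustained utilization
params_billion = 7          # e.g. a 7B-parameter model with INT8 weights

# Rule of thumb: ~2 operations per parameter per generated token.
ops_per_token = 2 * params_billion * 1e9
sustained_ops = npu_peak_tops * 1e12 * sustained_fraction

compute_bound_tokens_per_s = sustained_ops / ops_per_token
print(f"compute-bound ceiling: ~{compute_bound_tokens_per_s:.0f} tokens/s")

# Memory bandwidth usually dominates: each token reads all weights once,
# so ~7 GB of INT8 weights over an assumed ~100 GB/s of LPDDR5X caps it.
bandwidth_gbps = 100
weights_gb = params_billion  # 1 byte per parameter at INT8
bandwidth_bound_tokens_per_s = bandwidth_gbps / weights_gb
print(f"bandwidth-bound ceiling: ~{bandwidth_bound_tokens_per_s:.0f} tokens/s")
```

Even under these generous assumptions, the memory-bandwidth ceiling sits far below the compute ceiling, which is why on-device LLM text generation lands in a usable ~10 tokens/s range while far more compute-hungry image generation remains a stretch.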

AMD is aware of the challenges, especially after Intel ("Meteor Lake") and Microsoft (Copilot) spammed us with "AI" everywhere while huge chunks of the userbase fail to see a convincing argument. Privacy and security are on AMD's radar, and you need to "demonstrate that you actually increase productivity per workload. If you're asking people to spend more money [... you need to prove that] you can save hours per week....that could be worth the investment, [but] will require a massive education of the end-users." There is also a goal to give special love to "build the most innovative and disruptive form factors" for notebooks, so that people go "wow, there's something new here." Specifically in the laptop space, AMD is watching Qualcomm's Windows on Arm initiative very closely and wants to make sure to "launch a product only when it's ready," and to also "address price-points below $1000."

Where Does AMD Begin in 2024?
What's the first stop in AMD's journey? It's to ensure that it's able to grow its market share both on the client side with AI PCs, and on the data-center side with its AI GPUs. For AI PCs, the company believes it has a winning product with the Ryzen AI 300 series "Strix Point" mobile processors, which it thinks are in a good position to ramp through 2024. What definitely helps is the fact that "Strix Point" is built on the relatively mature TSMC 4 nm foundry node, on which it can secure volume, compared to Intel's "Lunar Lake" and upcoming "Arrow Lake," which are both expected to use TSMC's 3 nm foundry node. ASUS has already announced a mid-July media event where it plans to launch dozens of AI PCs, all of which are powered by Ryzen AI 300 series chips and meet Microsoft Copilot+ requirements. Over on the data-center side, AMD's MI300X accelerator is receiving spillover demand from competing NVIDIA H100 GPUs, and the company plans to continue investing in the software side of this solution, to bring in large orders from leading AI cloud-compute providers running popular AI applications.

The improvements to the software ecosystem will take some time; AMD is looking at a three-to-five-year timeframe, and to support that it has greatly increased its software engineering headcount, as mentioned before. It has also accelerated its hardware development: "we are going to launch a new [Radeon] Instinct product every 12 months," which is a difficult task, but one that helps it react quicker to changes in the software markets and their demands. On the CPU side, the company "now has two CPU teams, one does n+1 [next generation] the other n+2 [two generations ahead]," which reminds us a bit of Intel's tick-tock strategy, though that was more focused on silicon manufacturing. When asked about Moore's Law and its demise, the company also commented that it is exploring "AI in chip design to go beyond place and route," and that "yesterday's war is more rasterization, more ray tracing, more bandwidth"; the challenges of the next generations are not only in hardware, as nurturing relations with software developers plays a crucial role. AMD even thinks that the eternal tug-of-war between CPU and GPU could shift in the future: "we can't think of AI as a checkbox/gimmick feature like USB—AI could become the hero."

What's heartening is that AMD has made the bold move of mobilizing resources toward hiring software talent rather than acquiring another hardware company, as it usually does when its wallet is full—this should pay off in the coming years.

139 Comments on AMD is Becoming a Software Company. Here's the Plan

#2
john_
AMD is Becoming a Software Company
The best title I read the last few years, if it ends up true. Because with Intel closing the gap in manufacturing and also having the chance to be ahead again if its manufacturing works, and ARM starting to eat from the CPU pie, AMD needs to become a second Nvidia and that means ALSO software.
#3
Jack1n
Every masterpiece has its cheap copy.
#4
Ferrum Master
They should not have mentioned Tesla here.

My bet? They will flop hard, it is too late.

AI? We all know it will not take off. It is a marketing tool, nothing else.
#5
R0H1T
Jack1n: Every masterpiece has its cheap copy.
Namely?
Ferrum Master: AI? We all know it will not take off. It is a marketing tool, nothing else.
Not the "AI" we know from Ex Machina but pretty sure it's having a profound impact everywhere, even more so in the future!
#6
Kenjiro
Jack1n: Every masterpiece has its cheap copy.
Oh, You mean NVidia is a cheap copy? Shame on You!
#7
Daven
I also hope AMD stops acting like a paranoid small business too scared to realize that they are a bigger player now.

Two cases in point:

Roadmaps - AMD must disclose more about future products without fearing the competition will just copy them. Keeping plans secret until the last minute means less time for ISVs to understand your technology.

Product exclusivity - AMD has a very desirable SoC inside the Xbox and PS. But they are so scared of losing these two customers that they limit the features of their broader product stack. Apple finally forced their hand with the M series and now we are getting Strix Halo but it would have been nicer if AMD did this without being forced.

Act like no one can live without you and then customers will stay customers no matter what you do.
#8
R0H1T
I think it was partly about the price as well, we've had this talk on here for a while now, like who would pay for a $2000 APU-only AMD laptop? And at least back then the overwhelming sentiment was ~ not too many people! Apple opened up that market for other OEMs as well, now you can potentially have a massive IGP + 16c/32t on a laptop & people will "probably" pay for it.

Probably because it would need to perform well & come close to Apple's efficiency levels, that's still about a year or two away realistically speaking.
#9
64K
Interesting read. There's a lot that goes on behind the scenes that I would miss if not for good coverage and explanations like here in articles.

AMD isn't moving in every direction that pleases everyone. imo too many are focused on lack of competition against Nvidia's flagships but they downplay just how successful AMD has become by AMD's focus on what matters most to them.
#11
Ferrum Master
R0H1T: Namely?

R0H1T: Not the "AI" we know from Ex Machina but pretty sure it's having a profound impact everywhere, even more so in the future!
You mean if-else accelerated parsing, that's AI in disguise for most code we really see around? Maybe, but I cannot see it as nothing else another instruction set like MMX, SSE, AVX512... they just decided to market it, justify another race of useless upgrade blocks, that return you nothing. NPUs with TOPs xxx and now this Gen has that much... it doesn't work anymore.

To be fair... AMD has so much abused Foss in reality and the manners how they maintain ROC, the documentation is pretty arse, even understanding what arch is supporting what, it is all a guessing game... I am not so optimistic on this all PR talk when I look at the current reality and being an AMD user. It becomes better, but that's thanks to community, they forget that, it will backfire them if they will continue to make stupid decisions, at certain degree it already backfires as the comments are not praising.

I still believe tailored ASICs will come and mop the floor with tailored brute force versus the general usage GPU like pipes, that are still tailored for various data type, jack of all trades, master of none, and that's game over for nvidia and any others trying to upsell these things now. It's a trap and they are desperate. The problem with such approach... they are stupid and cheap, they won't be able to charge so much to make it profitable for them.

Quantity does not mean quality... that's the thing bad leaders in tech industry fail to understand. It all takes a talented tight working team and often smaller ones are better and once you have too much, it becomes a circus everyone looking on each other and waiting the other guy to do something(hello Intel). And those talented guys aren't that much around the globe and most are taken.
#12
Prima.Vera
So why AMD isn't capable of producing AI accelerator as nGreedia is?! Their GPUs are almost as powerful as nGreedia's. What's the holdup?? Just write a different API for it and create similar cores as CUDAs in NVidia... Or..?
#13
vmarv
Wasn't it already a software company? They did some programs and all failed along the way. The AMD Radeon RAMDisk and the AMD Radeon ProRender come to mind.
ProRender ended up being faster on the NVIDIA cards.
They closed the gap in the cpu world, but in the gpu market they should have invested in the research of a valid CUDA/RT/Tensor cores alternative. The gap is now so big, that it's impossible to close.
I mean, in 2023 (if I remember well) they released a driver update that significantly improved the performance of their cards in OpenGL (useful in the programs that use it, such as CAD software), after many years of silence. This isn't the right way to do things if you want to stay relevant in a field where there are thousands of companies working all over the world and some programs that are the standard used in the industry. Think about the CAD, VFX, 3D, animation, video editing and photo editing industries. It's a market that is worth billions and NVIDIA over the years has become the leader of it. And now there's the AI thing. AMD is too far behind.

One useful software that they can develop could be a 7zip/winrar alternative, powered by the gpu.
Winzip has the option to support gpu acceleration (OpenCL): I tried it and it was blazing fast, way faster than the cpu method used both by winzip and the competitors (meaning less than a minute instead of minutes). Unfortunately winzip is buggy and has a horrid interface, so it's a no-go. Well, AMD should step in.
And of course they should try to do a partnership with some big software house (like Adobe or Maxon) and work on a faster alternative to what NVIDIA has to offer with these programs. But meanwhile the green monster won't stay asleep.
#14
R0H1T
Ferrum Master: You mean if-else accelerated parsing, that's AI in disguise for most code we really see around? Maybe, but I cannot see it as nothing else another instruction set like MMX, SSE, AVX512... they just decided to market it, justify another race of useless upgrade blocks, that return you nothing. NPUs with TOPs xxx and now this Gen has that much... it doesn't work anymore.
I meant any place/work where you can justify laying off people for the new(AI) fad. Like any number of startups or even middling companies who replaced a majority of their support staff with "AI" bots. You can also do that in manufacturing & even "science" or medicine. If you look around you almost 70-90% jobs are repetitive in nature, with enough "incentives" i.e. shareholder value you can replace probably the majority of them in the coming decades. Yes I know more jobs will be created as well but "AI" in general will force a lot of hardships for years to come :shadedshu:
#15
bitsandboots
I want to believe.
NVIDIA's advantage here is that they have just one kind of accelerator—the GPU, which runs CUDA...

On the other hand, this is also a weakness of NVIDIA, and an advantage for AMD. AMD has a rich IP portfolio of compute solutions, ranging from classic CPUs and GPUs, to XDNA FPGA chips (through the Xilinx acquisition), now they just need to bring them together, ... to sell customers a single-product combined accelerator system comprised of components like a CPU, GPU and specialized FPGA(s)
Aside from nvidia not having x86, it's not like nvidia cannot do the same.
They already do, right? See www.techpowerup.com/280906/nvidia-announces-grace-cpu-for-giant-ai-and-high-performance-computing-workloads
#16
ymdhis
Kenjiro: Oh, You mean NVidia is a cheap copy? Shame on You!
Yeah, of SGI.
#17
mouacyk
The only frontier left is to compete. Sounds too much like a copout
#18
Ferrum Master
R0H1T: I meant any place/work where you can justify laying off people for the new(AI) fad. Like any number of startups or even middling companies who replaced a majority of their support staff with "AI" bots. You can also do that in manufacturing & even "science" or medicine. If you look around you almost 70-90% jobs are repetitive in nature, with enough "incentives" i.e. shareholder value you can replace probably the majority of them in the coming decades. Yes I know more jobs will be created as well but "AI" in general will force a lot of hardships for years to come :shadedshu:
Don't be afraid of changes. At least it is not like after war. We are human designed to adapt. If you work such type of work that is replicable by AI so be it... just as press, typewriters, smiths and tailors are also diminished. So what? I would be actually pleased to get rid of office plankton I have around... I rather communicate with an AI than those backstabbing snakes, that actually know nothing besides gossip and if you ask them to work, they open their eyes and are angered - how dare you.

It is really a brainwash song for shareholders. Imagine the engineering team now at AMD. What perspective and future, now you are second rate citizens here... Jim Keller is always right, do your job and get away, otherwise it gets personal.
#19
Assimilator
A decade late and many dollars short, I'm afraid. It will take significant time for AMD to hire the right people, and enough of them, to build a team capable of delivering software as good as CUDA; it will take significant time to build that software's features to a level somewhat competitive with CUDA; and it will take significant time to convince the market to commit to AMD's software over CUDA. The last point is particularly relevant, because even with massive incentives it's really difficult to dislodge a good incumbent - and NVIDIA's offering is very, very good.

Three to five years is a very optimistic target, especially given that the "AI" hype train will have derailed by the latter date. Gonna be real difficult to justify continued spending of millions on software engineers, if the potential ROI has vanished. So while I welcome this initiative, I really don't see it panning out how AMD management is hoping. Best of luck to y'all and I hope it works out despite these odds...
AMD is aware of the challenges, especially after Intel (Meteor Lake) and Microsoft (Copilot) spammed us with "AI" everywhere
@W1zzard this is a pretty hypocritical thing to say given that AMD has literally put "AI" in the fucking name of its CPUs, and Intel has not. Not to mention that they talk about "AI GPUs", whatever that bullshit may be...
#20
Vayra86
Assimilator: A decade late and many dollars short, I'm afraid. It will take significant time
This. Then again, NOT starting on it is a sure fire way to leave the building.
#21
Onasi
This honestly feels like something that should have started the moment AMD acquired ATI. And then they wasted that momentum on weird initiatives like pushing APUs as the "future of heterogeneous computing" and making GPUs with massive compute potential (higher than that of NVIDIA at the time), but with absolutely no software stack to even support it. It's good that they've finally woken up to the reality that, apart from console chips, the Radeon division was essentially dead weight stuck in limbo for a decade now, but they have to actually commit to transforming themselves this time around and prepare contingencies other than banking heavily on the AI fad.
#22
neatfeatguy
Despite having a CPU microarchitecture that at least matches Intel, the company barely commands a quarter of the market (both server and client processors combined); and despite its gaming GPUs being contemporary, it barely has a sixth of this market. This is not for a lack of performance—AMD makes some very powerful CPUs and GPUs, which are able to keep competitors on their toes. The number-one problem with AMD's technology has been relatively less engagement with the software vendor ecosystem
While I don't disagree, I'd like to point out that AMD's advertising - compared to Intel - is basically non-existent even though they were in competition with Intel back before I knew anything about them.

Growing up I only knew of Intel before I started to really get into computers, specifically from commercials and their jingle with the slogan "Intel inside." Intel was and still is the heartbeat of the CPU industry when it comes to home based systems (the laptops and desktops many people buy for daily home use) and office systems. When my step-dad came home with the top of the line PC back in '97 that housed a Pentium II 300....that was amazing! It had that "Intel Inside" sticker and a "Pentium 3" sticker on it.

In '99 when I did my first year of college I was irked that the computer they got me was a PowerSpec brand with an AMD Athlon in it. AMD? What the hell is AMD? This is some bullshit! Cool, it was a computer that I needed and could use, but AMD? My roommate had a Pentium 3, that lucky bastard. Over time I had come to find out that my Athlon pretty much walked all over his Pentium 3 and I couldn't understand why AMD wasn't a better household name. My roommate was pissed my computer was faster and his parents spent more on it. As I started to think about it, their advertisement was pretty much non-existent back then much like it is today.

I know you have the shady underhand tactics that Intel used and was fined for in the past, but still. Even if that wasn't the case, to this day, a lot of people still don't understand or know there are multiple companies out there making chips for desktops/laptops and so on. They just think Intel. And that is where I think AMD's greatest failure is.
#23
Jism
Obvious. Intel does have a strong brand. But it became very obvious that in those days clock speed was not everything. There were chips out there from AMD or Cyrix that were clocked slower but executed instructions faster. Many people did not understand that concept.

AMD is still in the game, and switching from a hardware focus to software now is the best route to go. You'll get a chance to extract all the performance AMD chips have to offer.
#24
DaemonForce
AMD has enough product diversity and experience in each market to make good on this choice.
AMD also has enough of the right people in the right places to hit the ground running on sudden sharp course corrections like this one.
Will it be perfectly smooth? Nah.
Will it be an interesting addition of product value and desperately needed competition in the market? Absolutely.
Can't wait.
#25
kiddagoat
Onasi: This honestly feels like something that should have started the moment AMD acquired ATI. And then they wasted that momentum on weird initiatives like pushing APUs as the "future of heterogeneous computing" and making GPUs with massive compute potential (higher than that of NVIDIA at the time), but with absolutely no software stack to even support it. It's good that they finally woken up to the reality that apart from console chips the Radeon division was essentially a dead weight stuck in limbo for a decade now, but they have to actually commit to transforming themselves this time around and prepare contingencies other than banking heavily on the AI fad.
I was thinking the same thing after reading this. I remember seeing all the "Fusion is the Future" marketing and there were big promises made then as well. I really hope they do make a turnaround in this venture. They have always had impressive hardware specs on paper, but the implementation has seemed to always miss the mark. The hardware isn't as good if there isn't the software ecosystem around to support it.

The last AMD products I owned were the Fury X/Nano. I have tried some of their newer GPUs, but I just haven't had the best luck with them. When using DVDFab for video work, it seems that CUDA is better optimized than AMD APP.

Their All In Wonder products were great back in the day, the software seemed to function really well. I am surprised they haven't done something like an AppleTV or a Nvidia Shield product.