Intel is asking for an additional $10 billion in CHIPS Act subsidies, as the chip giant feels it deserves more cash for its investments in US developments

A drone photo of Intel's new Fab 9 in Rio Rancho, New Mexico
(Image credit: Intel)

Poor Intel. Last year was pretty rough for the 55-year-old semiconductor firm, as it accrued just $54.2 billion in revenue, 14% less than the year before. After paying all its bills for manufacturing, research and development, and biscuits, there was just $1.7 billion left over in net income. Poor Intel. 

So when the US government signed the CHIPS and Science Act into law in 2022, with a total of $280 billion up for grabs, Intel jumped right in to get some of that golden booty. Only now it's asking for at least a further $10 billion to ensure its US developments can continue.

This news of Intel's re-enactment of Oliver Twist (via Wccftech) isn't in the least bit surprising, and not just because of the company's recent financial results. Cutting-edge semiconductor manufacturing plants are extremely expensive to design, build, or even just refurbish. For example, Intel's Fab 9 plant recently opened in Rio Rancho, New Mexico, as part of a $3.5 billion investment in the region.

One aspect of the new plant is its ability to handle Intel's Foveros packaging technology, which is used to stack multiple chiplets (or tiles, as Intel calls them) onto a single base die. The Core Ultra range of laptop processors was the first consumer-grade hardware to sport the latest version of this tech, and the forthcoming Arrow Lake and Lunar Lake designs will use it, too.

Hence it's understandable that Intel would go back to the US administration to wrangle a bigger slice of the CHIPS Act funding pot; an extra $10 billion would certainly help keep more of Intel's chip manufacturing in the US.

It's also tiny in comparison to the amounts of money some companies have been requesting. For example, SoftBank (the tech investment firm that owns Arm) is reportedly trying to raise $100 billion to create a startup that can beat Nvidia at the AI game, and OpenAI CEO Sam Altman has suggested it will take trillions of dollars to reshape the semiconductor industry into something able to cope with the future demands of compute and AI.

Compared to those figures, a mere ten thousand million dollars is almost nothing.


But let's say Intel does get that extra money: what difference would it make to the consumer? Well, you're almost certainly not going to see cheaper Core processors on shelves, because a lot of that money will come in the form of loans, which accrue interest and have to be repaid within a set period.

There's also the fact that thousands of people will need to be trained in high-end silicon chip manufacturing, and there's always the risk that such an investment doesn't pay off, as workers could leave before Intel sees a return on training them.

And Intel doesn't have the chip market to itself, of course. AMD has a decent slice of the x86 processor segment, and when it comes to GPUs and AI chips, Nvidia rules the roost right now.

At best, you're just going to see a little stamp on your Core Ultra saying 'Made in USA,' maybe even etched onto the chip itself; that's got to be worth the money, yes? So come on, politicians: Look into those watery eyes and hear the plea of 'Please Sir, can I have some more?' What a lovely tale we could tell around the Christmas tree, as we roast our chestnuts on a new 400W Intel CPU.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?