Off-Prem

ASUS quietly built supercomputers, datacenters and an LLM. Now it's quietly selling them all together

The plan is a slow build – not a breakout into enterprise tech


Taiwan's ASUS is best known for its laptops and Wi-Fi kit, but it's quietly building an enterprise tech and cloud business – and slowly introducing it to the world after big successes at home.

The Register learned of ASUS's plans at last week's Computex conference in Taiwan, where we met Jackie Hsu, a senior vice president who also serves as co-head of the Open Platform Business and IoT business groups.

Hsu pointed out that ASUS helped to build the Taiwania 2 supercomputer – a nine-petaflop machine that debuted at number 20 on the TOP500 list in 2018.

And last year it won a bid to help build the Taiwania 4 supercomputer. Hsu told us ASUS built a datacenter to house Taiwania 4, and achieved a power usage effectiveness (PUE) rating of 1.17 – a decent achievement for any facility, but a very good one in a hot and humid location like Taiwan.
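For context, PUE is simply the ratio of a facility's total power draw to the power consumed by its IT equipment, so 1.17 means roughly 17 percent overhead for cooling and power delivery. A minimal sketch of that arithmetic, using illustrative numbers rather than ASUS's actual figures:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A value of 1.0 would mean zero overhead; real datacenters sit above that.
    """
    return total_facility_kw / it_equipment_kw


# Hypothetical example: 1,000 kW of servers plus 170 kW of cooling,
# power conversion, and other overhead gives a PUE of 1.17.
print(pue(total_facility_kw=1170.0, it_equipment_kw=1000.0))  # 1.17
```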

Another little-known ASUS initiative is the Formosa Foundation Model – a 176 billion parameter large language model (LLM) tuned to generate text with traditional Chinese semantics. Hsu said LLMs trained on data in local languages are essential, as the corpus used to train most such models is dominated by American English.

ASUS also offers servers – vanilla models, nodes for supers, and the AI servers announced last week at Computex – and has done for years without becoming a major player in the field. But Hsu told The Register that the Taiwanese giant has engaged with hyperscalers who considered it as a supplier for their server fleets, and was able to demonstrate it can produce exceptionally energy-efficient machines.

ASUS is now putting together all of the above as an offering to clients. Hsu said he has already engaged with customers who could not match ASUS's ability to build datacenters at 1.17 PUE, and has seen interest in the Formosa Foundation Model.

The senior vice president said ASUS has already entered several engagements in which it designs and builds substantial systems to run AI, offering much of the software and hardware stack needed to do the job.

Hsu conceded that ASUS's small scale as a server maker compared to rivals means it cannot always compete on price – but said clients are willing to pay for its complete offering.

"This is definitely a big growth area for us," he told The Register.

For now, the company is moving quietly. Over time, Hsu hopes ASUS will become more of an enterprise player. And with demand for compute surging along with interest in AI, it has a chance to succeed – in its own neighborhood and beyond. ®

