Tag: Nvidia's

Game Reviews

Nvidia’s native support for Logitech racing wheels for GeForce Now has me excited for sim racing on a budget

by admin August 20, 2025


Nvidia has announced a huge raft of changes and improvements to their GeForce Now cloud gaming service as part of their Gamescom 2025 announcements, but it’s actually one of the smallest sections that has me most excited.

As part of their extensive press release covering exciting updates such as RTX 5080 power for GeForce Now Ultimate subscribers and the ability to play games at up to 5K2K 120fps on supported screens, one of the footnotes near the bottom mentions the following:

Support for popular peripherals also grows, with native support for many Logitech racing wheels offering the lowest-latency, most responsive driving experiences.

That’s right, folks – GeForce Now now has native support for Logitech G29 and G920 racing wheels for playing the service’s selection of sim racing titles, granting important force feedback and more analogue controls versus a mouse-and-keyboard setup or even a controller. Indeed, this has been quite the popular request on forums for a number of years, so it’s pleasant to see Nvidia respond.

At a recent Gamescom event, deputy tech editor Will and I had the chance to go hands-on with a demo rig Nvidia had set up (pictured above) using a budget Logitech G920 wheel on a proper cockpit playing arcade racer The Crew Motorfest. It perhaps wasn’t the most hardcore sim racing setup in terms of game or gear, but it was still an effective demo that proved out the concept.

I didn’t have any issues with the gameplay experience, in terms of stutters or input latency, and was largely impressed by what’s become possible with the cloud gaming space. Of course, with the venue in Cologne offering gigabit speeds to a regional data centre, it’s easy to see this as a best-case scenario that will have to be borne out in real-world testing on less capacious connections. The main thing was that the game’s force feedback was present and correct, whether I was drifting around roundabouts, running up the highway, or crashing off-road. Having used the G29 and G920 for several years at home, the cloud version didn’t feel any different.

Wheels such as this Logitech G29 are natively supported in GeForce Now.

The big thing for me is that it involved no computational power from the host device itself – in this instance, it was some form of small Minisforum mini PC, but Nvidia also had games running natively on LG TVs (4K 120fps with HDR is now accessible on 2025/2026 LG TVs with the new GeForce Now update) or off an M4 Mac Mini. Theoretically, this means all you need is a wheel, some kind of computer or device with support for the wheel, and a GeForce Now subscription, and you can be up and running – no need for a dedicated gaming or living room PC.

Of course, that is the whole point of cloud gaming, but it adds another string to your bow if you’re a current GeForce Now subscriber and you’ve felt the lack of a proper racing experience has been a sore miss. In addition, if you’ve already got a Logitech wheel from years ago and you want to jump into sim racing without the faff of a PC and such, then you can pay the subscription, and away you go.

An Nvidia representative told me that the main technical difficulty was passing through effects such as force feedback in the respective games over the cloud. The reason they chose Logitech peripherals initially was due in part to the convenience of its G Hub software, which runs in a compatibility layer of sorts to get the wheels working, as well as the wide range of wheels Logitech makes. The G29 and G920 are the only supported models at present, with more wheels to be supported in the future.

Before I go, I’ll provide a quick rundown of the other key additions for GeForce Now:

  • Implementation of the Blackwell architecture – RTX 5080-class hardware now powers the ‘Ultimate’ tier, bringing DLSS 4 MFG and so on, plus streaming at up to 5K 120fps.
  • ‘Cinematic Quality’ mode for better extraction of fine detail in areas where the encoder would previously struggle.
  • More devices supported with native apps, including Steam Deck OLED at 90fps (to match the refresh rate), plus some 2025+ LG TVs at 4K/120fps.
  • Support for 1080p/360fps and 1440p/240fps streams for competitive esports titles, involving Nvidia Reflex and sub-30ms response times. (We saw 17ms figures in Overwatch 2, for example.)
  • A GeForce Now installation of Fortnite integrated into the Discord app, providing a limited-time trial of GeForce Now’s 1440p ‘Performance’ tier and requiring only that you link an Epic Games account with a Discord account.
  • ‘Install to Play’ feature in the GeForce Now app, which more than doubles the playable titles to some 4,500, giving access to over 2,000 installable games through Steam alongside Nvidia’s fully-tested ‘Ready to Play’ games. Installs must be repeated each session, unless you pay for persistent storage in 100GB+ increments.

It’ll be fascinating to see whether Nvidia continues to expand their peripheral support over time, as I’m sure flight sim fans could also benefit from a cloud-streamed version – especially with the CPU and GPU requirements that Flight Sim 2020 and 2024 entail.



Gaming Gear

Nvidia’s app gets global DLSS override and more control panel features

by admin August 19, 2025


The Nvidia app is getting improvements to DLSS override, more control panel features, and Project G-Assist changes this week. Nvidia has been gradually improving its new app over the past 18 months since its release, and it’s getting closer to fully migrating all the legacy control panel options.

This week’s Nvidia app update will include anisotropic filtering, anti-aliasing, and ambient occlusion options, meaning you won’t have to navigate to Nvidia’s older control panel app to improve classic games. The setup tool for Nvidia Surround will also be part of the Nvidia app now.

You also won’t have to configure DLSS override features on a per-game basis anymore, as Nvidia is now adding a global option. You can set your DLSS preferences across all override-supported games, and Nvidia’s overlay will also show which DLSS settings are active if you toggle this option on.

Nvidia is also bringing its new Smooth Motion feature, which was previously exclusive to RTX 50-series GPUs, to all RTX 40-series owners. It’s a driver-based AI model that enables smoother gameplay for games that don’t support DLSS Frame Generation. Smooth Motion can be applied to games running with DLSS Super Resolution, at native resolution, or even titles with other upscaling technologies. Nvidia says it will typically double “the perceived frame rate.”

If you’re a fan of Nvidia’s G-Assist AI assistant, Nvidia is changing the AI model behind the scenes so it will use 40 percent less memory. The smaller footprint won’t affect performance either, as it’s designed to respond even faster to queries.

These latest Nvidia app changes will arrive on August 19th for beta users at 9AM PT / 12PM ET, followed by a general release next week.



Gaming Gear

Even Nvidia’s China-specific RTX 5090D falls victim to the infamous 16-pin melting issue

by admin June 21, 2025



The Chinese-exclusive GeForce RTX 5090D, which has the potential to rival the best graphics cards, is the latest Blackwell graphics card to be affected by issues related to 16-pin (12VHPWR) power connector meltdowns. Uniko’s Hardware has unearthed two recent instances of the GeForce RTX 5090D with melted 16-pin power connectors, reported on the Baidu Tieba forums.

With the introduction of the revised 12V-2×6 power connector, we had hoped the meltdown issue was behind us. However, doubt remains, since every once in a while we see a user report pop up on the internet. While reports of 16-pin power meltdowns have become less frequent, they haven’t disappeared entirely. As we’ve seen, there is no specific incubation period for the meltdowns; they can occur a few days or weeks after the build, or sometimes even years later.

A Baidu Tieba user reported that his Aorus GeForce RTX 5090D Master Ice functioned well for two months before the 16-pin power connector melted. He reportedly used the native 16-pin power cable supplied with his Segotep KL-1250G power supply. Segotep, a well-known Chinese manufacturer, has over two decades of experience under its belt. The KL-1250G is an ATX 3.0 power supply rated at 1,250W and certified for 80 Plus Gold efficiency.



Another forum user recounted his unfortunate experience with his Gainward GeForce RTX 5090D graphics card and an Asus ROG Loki power supply. He did not specify the exact model he possessed but merely stated that he utilized the native 16-pin power cable. In his situation, the 16-pin power connector melted on both the graphics card and the power supply side.

In the first case, forum members criticized the user’s Segotep KL-1250G power supply. One participant mentioned that this power supply was frequently associated with the previous GeForce RTX 4090 meltdowns. Although Segotep is an established name in the Chinese market, we cannot vouch for the quality of its products. In the second case, however, the user deployed an Asus ROG Loki unit, so its quality was not in doubt.

There have been many theories about what’s causing the 16-pin power connector meltdowns on Nvidia’s GeForce RTX 40-series (codenamed Ada Lovelace) and GeForce RTX 50-series (codenamed Blackwell) graphics cards. One of the most prominent is that Nvidia’s revised PCB design for Ada Lovelace and Blackwell has effectively eliminated load sensing and balancing. By contrast, the older GeForce RTX 30-series (codenamed Ampere) graphics cards retained this feature, which is why the GeForce RTX 3090 Ti, despite having the same 450W TDP as the GeForce RTX 4090, never suffered from melting connectors.





Product Reviews

Tested: Nvidia’s GeForce Now just breathed new life into my Steam Deck

by admin May 29, 2025


I don’t want gaming to become another streaming subscription service that keeps going up in price. I don’t want to put even more power in Nvidia’s hands, particularly not right now.

But I can’t deny that the company’s $20-a-month* GeForce Now is a near-perfect fit for the Steam Deck. I’ve been covering cloud gaming for 15 years, and this is the very first time I’ve wanted to keep playing indefinitely.

For the uninitiated, Nvidia’s GeForce Now is a game streaming service that farms the graphical processing power out to the cloud. Instead of controlling a game running locally on your Steam Deck’s chip, you’re effectively remote-controlling an RTX 4080-powered* gaming rig in a server farm many miles away, which you sync with your existing Steam, Epic, Ubisoft, Xbox, and Battle.net accounts to access your games and savegames from the cloud.

*Nvidia’s GeForce Now also technically has a free tier, and a “Performance” tier, but I recommend you ignore both. For me, it was the difference between playing many games through a clean window or a dirty window, the difference between playing Alan Wake II and Indiana Jones with full ray tracing or none at all, the difference between comfortably stretching to 4K or not.

Handhelds have already become my favorite way to play games. The Steam Deck is comfortable and easy to pick up whenever and wherever the mood strikes. But neither my Deck nor my aging desktop PC have kept up with the latest titles. Clair Obscur: Expedition 33 and Baldur’s Gate 3 can look like a fuzzy mess on a Deck, and I’ve never seen Alan Wake II, Portal RTX and Indiana Jones and the Great Circle in all their ray-traced glory on my RTX 3060 Ti desktop.

But today, with Nvidia’s just-now-released GeForce Now app for the Steam Deck, I can play every one of those titles at near-max settings, anywhere in my home, for hours and hours on a charge. And if I dock that Steam Deck to my 4K TV, it can output 4K60 HDR and/or ray-traced graphics that put the PS5 Pro to shame.

When we tested GeForce Now’s last big upgrade in 2023, Tom and I agreed it wasn’t quite on par with playing on a native PC.

But on a Steam Deck, where I’m either playing on a low-res handheld screen or sitting across the room from a TV where I don’t notice so many imperfections, it can feel like the best of both worlds.

Here’s what Expedition 33 looks like running natively on my Steam Deck today, versus the Deck with GeForce Now:

The best part might be this: while handhelds like the Steam Deck barely manage two hours of such a game at potato graphical settings, I could get 7 to 8 hours out of GeForce Now. I saw the cloud gaming service consistently sip under 7 watts from my Steam Deck OLED’s 49.2 watt-hour battery, barely more than the system consumes at idle.
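
For a rough sanity check on those figures, here’s a back-of-the-envelope runtime estimate. It’s only a sketch built from the numbers quoted above (a 49.2Wh battery and roughly 7W while streaming); the local-play figure is an illustrative assumption, and real runtime will vary with brightness, Wi-Fi conditions and the game itself.

```python
# Back-of-the-envelope battery life estimate for the Steam Deck OLED,
# using the figures quoted in the article (not independent measurements).
BATTERY_WH = 49.2       # Steam Deck OLED battery capacity, watt-hours
STREAMING_DRAW_W = 7.0  # approximate system draw while streaming GeForce Now
LOCAL_DRAW_W = 22.0     # hypothetical draw for a demanding local game (illustrative assumption)

def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    """Estimated runtime in hours = capacity (Wh) / average draw (W)."""
    return capacity_wh / draw_w

print(f"Streaming:  ~{runtime_hours(BATTERY_WH, STREAMING_DRAW_W):.1f} hours")  # ~7.0 hours
print(f"Local play: ~{runtime_hours(BATTERY_WH, LOCAL_DRAW_W):.1f} hours")      # ~2.2 hours
```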

And the new native app makes it a cinch to set up, with no more web browser-and-script workaround: just hold down the power button and switch to desktop mode, download the app, run it, and scan QR codes with your phone to link your various accounts.

Oh, you’d best believe there are caveats. Giant gaping gotchas galore, which I’ll explain as we go. But after testing the service for nearly two weeks, I’m starting to believe in cloud gaming again.

Now, you might be wondering: how the heck am I playing a game where timing is so critical via remote control? Here’s the first big caveat: you need a low-latency internet connection, a good Wi-Fi router or wired ethernet, and you need to be within range of Nvidia’s servers for the magic to work. Download speed isn’t as key: 50Mbps should suffice for 4K, and you can get away with less.

But I’m armed with an AT&T Fiber connection, and I live maybe 30 minutes away from Nvidia’s San Jose, California servers, which makes me a best-case scenario for this tech. Still, Nvidia has over 35 worldwide data centers now, including 14 distinct locations in the United States, and my colleagues with Xfinity and Spectrum cable internet in Portland and Brooklyn tell me Expedition 33 played just as well for them.

Rough server locations for GeForce Now. Image: Nvidia

“The latency was negligible to the point that I wasn’t missing parries,” Cameron Faulkner tells me, saying he nailed the Sad Troubadour on the first try. Jay Peters and I found we needed to adjust our timing a bit, but I wound up playing roughly half the game over GeForce Now and almost never looked back.

Even with the best of connections, though, GeForce Now isn’t bulletproof. Once or twice a day, my seemingly stable gameplay session would at least briefly unravel into a choppy mess.

In single-player games like Expedition 33 and Indiana Jones I could easily forgive a few minutes of trouble, but my colleagues Antonio Di Benedetto and Erick Gomez saw it in otherwise stable twitch shooters where lag could be a bigger issue. “I saw a handful of lag spikes / hiccups that would definitely screw anybody over in a competitive shooter, but thankfully they weren’t at the worst times and they soon subsided,” Antonio tells me.

You also give up some of the Steam Deck’s portability. While you can plug and unplug the Steam Deck from a TV dock and seamlessly switch between big screen and small screen play, you can’t just put the Steam Deck to sleep without ending the session and losing unsaved progress. (Unlike, say, Chiaki.) And although the native GeForce Now app supports 4K60, a big leap up from 1440p, you may find yourself squinting at tiny text because it doesn’t scale the UI appropriately.

Also: while GeForce Now also supports a lower res but smoother 1440p 120Hz mode on TVs and even other gaming handhelds, it doesn’t offer a 90Hz mode for the Deck OLED yet. I tested at 60Hz instead.

Tiny text. Screenshot by Sean Hollister / The Verge

Speaking of portability, public Wi-Fi generally isn’t good enough for GeForce Now, and neither are most cellular connections — even with four bars of Verizon 5G UWB service and a wired USB tether to my phone, my stream quickly deteriorated into the jumble you see below. Only the very best cellular connection in my entire neighborhood, a spot right under a 5G tower where I can get 1,200Mbps down and 30 millisecond ping, felt playable to me.

This is on four bars of Verizon 5G UW. It actually got worse after this, with ping in the 500ms range. Screenshot by Sean Hollister / The Verge

And, as we’ve discussed previously, only the $20-a-month Ultimate tier truly shows what cloud gaming is capable of. Expedition 33 looked substantially worse on the Performance tier (Epic spec, native resolution, vs. Medium spec, 50 percent resolution with DLSS) and Indiana Jones went from gorgeous to just “playable while handheld” for me.

But the biggest caveat with GeForce Now may be outside the company’s control: you have to bring your own games, and yet you can only bring games for which Nvidia has explicitly struck a distribution deal.

Nvidia has made progress: 165 of my 457 Steam games are now available to play, up from 85 two years ago. The company offers over 2,100 games in total across Epic, Battle.net, Ubisoft, Xbox, and GOG too. But Nvidia has no games from Sony, so I’m not playing Helldivers 2, no games from Rockstar, so I’m not playing GTA V or Red Dead Redemption 2, and no Elden Ring, no PUBG, no Schedule I or Football Manager or FIFA or NBA or The Sims. We never quite know which games GeForce Now will get, or when, or if they might disappear.

Cloud gaming has never felt like a better deal, now that the service has matured, now that handhelds can make such good use of it, and now that buying your own GPU is such a ridiculously expensive proposition. Maybe I’ll defer my own next GPU upgrade in favor of a subscription.

But it’s not for everyone – you should definitely try an $8 GeForce Now Ultimate day pass first – and there’s still a lot of mental friction. I’m not looking forward to the day that Nvidia alters the deal further.





Game Reviews

Nvidia’s RTX 2080 Ti revisited in 2025: seven years old – and it’s still delivering

by admin May 29, 2025


They call it ‘fine wine’ – the concept of a PC component still delivering impressive performance years on from its release. Nvidia’s Turing architecture – the RTX 20-series cards – wasn’t exactly well regarded at launch back in 2018, but with the RTX 2080 Ti, I’d say we’re looking at fine wine at its best. Its performance today battles it out with the recently released RTX 5060, it has more memory than the new Nvidia offering, and its output doesn’t decline on PCIe gen 3-based PCs… because it is a PCIe gen 3 card. Despite its seven-year vintage, it’s still a card that outperforms the current generation consoles and even taps into some (though not all) of Nvidia’s latest neural rendering technologies. This is indeed fine wine, but fine wine with a chaser, if you like.

All of which raises an interesting question: a used RTX 2080 Ti costs pretty much the same as an RTX 5060 – so does this make it worthy as a used purchase for a budget PC? Well, AMD’s upcoming RX 9060 XT launch might have something to say about that, but yesterday’s flagship is certainly causing a headache or two for today’s 50-series mainstream offering – and emphasises the importance of an appropriate hardware balance between compute power, RT and machine learning features and available VRAM.

While the focus in this piece is about the RTX 2080 Ti, it would be remiss of me not to point out that the used market has a number of good options, all of them compliant with the DX12 Ultimate standard. AMD’s RX 6700 XT has more memory and is typically a fair amount cheaper second-hand. Meanwhile, the 16GB RX 6800 effectively solves the VRAM problem completely, but does tend to cost more than the 2080 Ti based on eBay completed sales results. In the video below, you’ll see how my benchmarking worked out with both of these AMD offerings, the RTX 5060 and the RTX 2080 Ti. Spoilers: the 2080 Ti wins on aggregate when put through our entire benchmarking suite, as the table below demonstrates. The video is worth watching for the most noteworthy results, however.

Here’s a video that shows how the RTX 2080 Ti holds up in 2025: benchmarks, custom testing, potential used purchase alternatives… it’s all in here. Watch on YouTube.

Used GPUs vs RTX 5060 (FPS Averages)

GPU | 1920×1080 | 2560×1440
Nvidia GeForce RTX 2080 Ti | 71.0 (100%) | 50.9 (100%)
Nvidia GeForce RTX 5060 | 68.4 (96.3%) | 44.3 (87.0%)
AMD Radeon RX 6800 | 62.4 (87.9%) | 44.3 (87.0%)
AMD Radeon RX 6700 XT | 52.6 (74.0%) | 35.9 (70.5%)

How it wins is quite fascinating. In dealing with rasterisation performance without VRAM constraints, the RTX 2080 Ti is effectively a ringer for the new RTX 5060 with many games operating at close to identical frame-rates. Ray tracing is another story: in some titles, the RTX 2080 Ti performs a lot better than RTX 5060. In other tests, 2080 Ti falls a touch short. The RTX 50-series Blackwell architecture seems to struggle with some RT titles, such as Avatar: Frontiers of Pandora and F1 24. In these scenarios, the RTX 2080 Ti can be a runaway winner. While the video above highlights the benchmarks that interested me, the results of all of our tests are aggregated into the accompanying table.
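
For clarity, the percentages in the table above are simply each card’s average FPS expressed relative to the RTX 2080 Ti at the same resolution. Here’s a minimal sketch reproducing them (tiny rounding differences from the published figures are possible):

```python
# Express each GPU's average FPS relative to the RTX 2080 Ti baseline,
# using the aggregate figures from the table above.
averages = {
    "Nvidia GeForce RTX 2080 Ti": {"1080p": 71.0, "1440p": 50.9},
    "Nvidia GeForce RTX 5060":    {"1080p": 68.4, "1440p": 44.3},
    "AMD Radeon RX 6800":         {"1080p": 62.4, "1440p": 44.3},
    "AMD Radeon RX 6700 XT":      {"1080p": 52.6, "1440p": 35.9},
}

baseline = averages["Nvidia GeForce RTX 2080 Ti"]
for gpu, fps in averages.items():
    rel = {res: 100 * fps[res] / baseline[res] for res in fps}
    print(f"{gpu:28s} 1080p: {rel['1080p']:5.1f}%  1440p: {rel['1440p']:5.1f}%")
```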

The RTX 2080 Ti also seems to answer the question of how much VRAM is appropriate for a card of this class. Even at 1080p resolution, we can find examples of the RTX 5060’s 8GB of framebuffer memory falling short. Marvel’s Spider-Man 2 and Monster Hunter Wilds in particular highlight that while the raw horsepower is there to produce decent frame-rates with ray tracing active, you need more than 8GB to get the job done. While 12GB is more common, 11GB seems enough to make these games run well and to ace our benchmark suite where the RTX 5060 falls short.

I also spent some time doing some custom testing, similar to the RTX 5060 review, starting with PlayStation 5 console comparisons. So, here’s the thing. Generally speaking, console performance trends upwards against equivalent PC parts. The PlayStation 4’s GPU is effectively a customised Radeon HD 7850/7870 hybrid – but the results in the mid to late era of the console’s lifespan effortlessly outstrip what those GPUs produced. The RTX 2080 Ti not only predated the current-gen consoles by two years, but getting on for five years after their launch, it continues to power past their capabilities.



I could run Forza Horizon 5 at console equivalent performance mode settings and get a significant frame-rate advantage while using Nvidia DLAA anti-aliasing for superior image quality. True, that title has a 60fps cap, so we don’t quite see full GPU potential from the console – but the results of the 2080 Ti effortlessly outstrip consoles in Black Myth: Wukong and even Alan Wake 2. The Remedy game in particular is testament to the RTX 2080 Ti’s staying power: while AMD GPUs of the era struggle to keep up, the Turing architecture delivered tech like mesh shaders years ahead of RDNA 2, so the game still runs very, very well on Nvidia’s seven-year-old vintage flagship. Alan Wake 2 also recently received an upgrade for RTX Mega Geometry – another new neural rendering technology – and yes, it runs fine on the RTX 2080 Ti.

The new DLSS transformer model upscaler also runs well on Nvidia’s original RTX flagship, losing just six percent of its performance in my testing. That said, there are hints that perhaps the RTX 2080 Ti’s longevity with cutting-edge tech may be coming to an end. Yes, you can run all the latest RTX technologies on it – bar frame generation – but while the transformer model super resolution tech runs pretty well on older cards, Turing and Ampere GPUs suffer badly with transformer model ray reconstruction, which for RT titles is what I’d call a marquee feature. Ray reconstruction, at a basic level, is essentially an upscaler for ray tracing effects and can be transformative. I found that the older version of ray reconstruction delivered a 41 percent performance advantage over the transformer model on this card – Turing just can’t hack it.

The legacy CNN model is still fine, but it has its issues and the transformer model ray reconstruction tech really is a generation beyond. The RTX 2080 Ti can run it, but that doesn’t necessarily equate with it running well. And some neural rendering features may never appear on the older cards. At the mainstream end of today’s market with RTX 5060 and RTX 5060 Ti, frame generation and multi frame-gen are not quite the “fire and forget” FPS boosting solutions that they are on the more expensive cards. And yet, I do think they have value and are worth having when most of today’s displays have high refresh rates and VRR support.

Top-left, RTX 2080 Ti can beat RTX 5060 conclusively when a game needs more than 8GB of memory. Meanwhile, both 2080 Ti and 5060 can exceed PS5 performance – sometimes with improved settings or via DLSS. | Image credit: Digital Foundry

So, returning to the question of acquiring an RTX 2080 Ti for a budget build, there are certainly pros and cons. The RTX 5060 doesn’t have enough VRAM for its performance level, while the 2080 Ti does. The 5060 uses a PCIe 5.0 interface, but it’s cut back to x8 rather than x16, so performance degrades on older rigs based on PCIe 3.0 CPUs and motherboards – which doesn’t affect the RTX 2080 Ti at all. However, the RTX 2080 Ti has its disadvantages beyond poor performance with DLSS transformer model ray reconstruction: no frame-gen support and an uncertain future with Nvidia’s upcoming ML features.

More to the point, we’re talking about an enormous chip based on TSMC’s 12nm process. Compared to the RTX 5060 based on TSMC 4nm, the older card sucks up twice as much power (and sometimes more!). Obviously, this will have an impact on running costs – but it also means that your PC will need to be able to handle a much larger card kicking out a lot more heat. With that in mind, if you are considering a used RTX 2080 Ti, definitely consider a larger card with a big heatsink and three fans. I used a Founders Edition card for my testing and it reminded me of the painful days where I could burn my hand when swapping out a GPU! I guess the final ‘con’ of a 2080 Ti purchase is its sheer age – first launched in 2018, a used buy could have gone through a mining boom or two.

Even so, I love the RTX 2080 Ti ‘fine wine’ narrative. While many people are still holding on to the GTX 10-series cards, the truth is that the Pascal architecture lacks the features needed for all of today’s games. The RTX 2080 Ti has them all and still manages to run demanding titles at perfectly decent frame-rates, while DLSS continues to prove its worth. While other rival cards from the era can’t run certain games and as the “8GB is enough” VRAM era comes to a close, the RTX 2080 Ti continues to deliver – and it still outpaces PS5 and Series X. Nvidia’s vision for the future of graphics tech didn’t go down well with reviewers back in 2018 but today, the balance of features, memory and performance holds up. Can we say the same for its modern equivalent, the RTX 5090? I guess I’ll let you know in 2032.



Gaming Gear

Intel’s surprise dual-GPU card just dropped with 48GB VRAM, and it might shake Nvidia’s pro lineup

by admin May 26, 2025



  • Intel’s Arc Pro B60 Dual offers pro-grade memory at a fraction of Nvidia’s price
  • This dual-GPU rig from Maxsun delivers workstation power
  • Each GPU gets one DisplayPort and one HDMI, avoiding OS overload in multi-GPU workstations

At Computex 2025, Maxsun unveiled a striking new entry in the AI hardware space: the Intel Arc Pro B60 Dual GPU, a graphics card pairing two 24GB B60 chips for a combined 48GB of memory.

ServeTheHome claims Maxsun envisions these cards powering dense workstation builds with up to four per system, yielding as much as 192GB of GPU memory in a desktop-class machine.

This development appears to have Intel’s implicit approval, suggesting the company is looking to gain traction in the AI GPU market.



A dual-GPU card built for AI memory demands

The Arc Pro B60 Dual GPU is not designed for gaming. Instead, it focuses on AI, graphics, and virtualization tasks, offering a power-efficient profile.

Each card draws between 240W and 300W, keeping power and thermal demands within reach for standard workstation setups.

Unlike some alternatives, this card uses a blower-style cooler rather than a passive solution, helping it remain compatible with conventional workstation designs. That matters for users who want high-end performance without building custom cases or cooling systems.

Still, the architecture has trade-offs. The card relies on x8 PCIe lanes per GPU, bifurcated from a x16 connector. This simplifies design and installation but limits bandwidth compared to full x16 cards.


Each GPU also includes just one DisplayPort and one HDMI output. That design choice keeps multi-GPU setups manageable and avoids hitting OS-level limits; older Windows versions, for example, may have trouble handling more than 32 active display outputs in a single system.

The card’s most intriguing feature may be its pricing. With single-GPU B60 cards reportedly starting around $375 MSRP, the dual-GPU version could land near $1,000.

If that estimate holds, Maxsun’s card would represent a major shift in value. For comparison, Nvidia’s RTX 6000 Ada, with the same 48GB of VRAM, sells for over $5,500. Two of those cards can push costs north of $18,000.
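
Taking those ballpark figures at face value – a roughly $1,000 dual-GPU card and a $5,500 RTX 6000 Ada, both with 48GB – the value gap is easy to quantify. The following is only a sketch using the article’s estimated prices, not confirmed retail pricing:

```python
# Cost per gigabyte of VRAM, using the ballpark prices quoted above
# (the dual B60 price is an estimate, not confirmed retail pricing).
cards = {
    "Maxsun Arc Pro B60 Dual (est.)": {"price_usd": 1_000, "vram_gb": 48},
    "Nvidia RTX 6000 Ada":            {"price_usd": 5_500, "vram_gb": 48},
}

for name, spec in cards.items():
    per_gb = spec["price_usd"] / spec["vram_gb"]
    print(f"{name:32s} ${per_gb:6.2f} per GB of VRAM")
# Roughly $21/GB vs roughly $115/GB – about a 5.5x difference at these prices.
```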

Even so, Intel’s performance in professional applications remains an open question. Many creative professionals still favor Nvidia for its mature drivers and better software optimization.




Gaming Gear

Nvidia’s CEO says attempts to control chip exports to China are a failure: ‘If they don’t have enough Nvidia, they will use their own.’

by admin May 21, 2025



Attempts by the US government to put a cap on China’s development of AI technologies by limiting exports of GPUs have been a “failure”. So says no less an authority on the subject than Nvidia CEO Jensen Huang.

The New York Times quotes Huang at the ongoing Computex show in Taipei, Taiwan denouncing GPU export controls. “AI researchers are still doing AI research in China,” Huang said on Wednesday. “If they don’t have enough Nvidia, they will use their own,” he said. All of which means, “the export control was a failure.”

He may have a point. But then Nvidia does rather have a dog in this fight. Huang himself says that restrictions on Nvidia’s H20 GPU will cost the company $15 billion in sales. So, it’s not hard to understand why he might prefer those limitations to be lifted.



Just for context, back in 2022 the former Biden administration imposed limits on the export of the most powerful GPUs from the US into China. Into the void left by restricted Nvidia exports has moved local outfit Huawei, whose GPUs currently do not match those of Nvidia for AI prowess. However, the fear is that the GPU export restrictions have only encouraged Huawei to put even more effort into closing the gap.

Indeed, according to the New York Times, Nvidia is concerned about just that, with an adjacent worry that, “any advantage gained by Huawei in China could eventually spread into other markets, helping Huawei build a stronger foundation from which to compete around the world.”


Meanwhile, it’s a little difficult to gauge Jensen Huang’s strategy and loyalties in all this. He recently appeared with other business leaders as a guest of the Trump administration in Saudi Arabia. But Nvidia has also just unveiled what will be a new global headquarters in Taiwan, which doesn’t entirely square with the broader push to reshore tech manufacturing to the US.

Likewise, the New York Times reports that, “the day after the US government opened an investigation into whether Nvidia’s previous sales to China had violated its rules, Mr. Huang met with top economic and trade officials in Beijing.”

The plot, as they say, thickens. At the very least, it seems Huang and Nvidia are keeping their options fully open.



Gaming Gear

AMD launches Radeon AI Pro R9700 to challenge Nvidia’s AI market dominance

by admin May 21, 2025



AMD has been busy at Computex 2025, where the chipmaker unveiled the exciting Radeon RX 9060 XT and the Ryzen Threadripper 9000 series. To cap off its series of announcements, AMD introduced the Radeon AI Pro R9700, a PCIe 5.0 graphics card designed specifically for professional and workstation users.

RDNA 4 is an architecture geared towards gaming, but that doesn’t mean AMD can’t apply it to professional-grade graphics cards. For instance, RDNA 3 saw the mainstream Radeon RX 7000 series successfully coexisting with the Radeon Pro W7000 series. The same situation will occur with RDNA 4. AMD has already unveiled four RDNA 4-powered gaming graphics cards, yet the Radeon AI Pro R9700 is the first RDNA 4 professional graphics card to enter the market. The new workstation graphics card aims to replace the RDNA 3-powered Radeon Pro W7800, which has been faithfully catering to consumers since 2023.

The Radeon AI Pro R9700 utilizes the Navi 48 silicon. It’s currently the largest RDNA 4 silicon to date, with a die size of 357 mm² and home to 53.9 billion transistors. Navi 48 is also found in the Radeon RX 9070 series. It’s a substantially smaller silicon than the last-generation Navi 31 silicon, which is 529 mm² with 57.7 billion transistors. It’s nothing short of impressive that Navi 48 is roughly 33% smaller but still has 93% of the transistors of Navi 31.




Navi 48, a product of TSMC’s N4P (4nm) FinFET process node, adheres to a monolithic design. Navi 31, by contrast, features an MCM (Multi-Chip Module) design, with memory chiplets arranged around a central graphics die. That’s the reason why Navi 31 is so enormous: the GCD (Graphics Compute Die) alone measures 304.35 mm², whereas each of the six MCDs (Memory Cache Die) is 37.52 mm².

With Navi 48, AMD returned to a monolithic die and, with N4P’s help, reduced the die size by 33%. Nonetheless, Navi 48 is up to 38% denser than Navi 31. The former has a density of 151 million transistors per mm², whereas the latter comes in at 109.1 million transistors per mm².
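
Those figures follow directly from the transistor counts and die areas quoted above; here’s a minimal sketch of the arithmetic:

```python
# Die size and transistor density comparison for Navi 48 vs Navi 31,
# using the figures quoted in the article.
navi48 = {"transistors_billion": 53.9, "area_mm2": 357}
navi31 = {"transistors_billion": 57.7, "area_mm2": 529}

def density(chip: dict) -> float:
    """Millions of transistors per square millimetre."""
    return chip["transistors_billion"] * 1_000 / chip["area_mm2"]

d48, d31 = density(navi48), density(navi31)
print(f"Navi 48 density: {d48:.0f} Mtr/mm²")                                     # ~151
print(f"Navi 31 density: {d31:.1f} Mtr/mm²")                                     # ~109.1
print(f"Die size reduction: {1 - navi48['area_mm2'] / navi31['area_mm2']:.0%}")  # ~33%
print(f"Density advantage:  {d48 / d31 - 1:.0%}")                                # ~38%
```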

In terms of composition, the Navi 48 features 64 RDNA 4 Compute Units (CUs), which enable a maximum of 4,096 Streaming Processors (SPs). In contrast, the Navi 31 is equipped with 96 RDNA 3 CUs, for a total of 6,144 SPs. More CUs don’t necessarily mean more performance since RDNA 4 delivers considerable generation-over-generation performance uplift over RDNA 3.

AMD Radeon AI Pro R9700 Specifications

Graphics Card | Radeon AI Pro R9700 | Radeon Pro W7800
Architecture | Navi 48 | Navi 31
Process Technology | TSMC N4P | TSMC N5 / N6
Transistors (Billion) | 53.9 | 57.7
Die size (mm²) | 357 | 529
SMs / CUs | 64 | 70
GPU Shaders (ALUs) | 4,096 | 4,480
Tensor / AI Cores | 128 | 140
Ray Tracing Cores | 64 | 70
Boost Clock (MHz) | ? | 2,525
VRAM Speed (Gbps) | ? | 18
VRAM (GB) | 32 | 32 / 48
VRAM Bus Width | ? | 256-bit / 384-bit
L2 / Infinity Cache (MB) | ? | 64 / 96
Render Output Units | 128 | 128
Texture Mapping Units | 256 | 280
TFLOPS FP32 (Boost) | 48 | 45.3
TFLOPS FP16 (INT4/FP4 TOPS) | 96 | 90.5
Bandwidth (GB/s) | ? | 576 / 864
TBP (watts) | 300 | 260 / 281
Launch Date | July 2025 | April 2023
Launch Price | ? | $2,499 / ?

AMD, being AMD as usual, didn’t reveal the Radeon AI Pro R9700’s entire specifications. However, the chipmaker did boast about the graphics card’s 128 AI accelerators, meaning it’s leveraging the full Navi 48 silicon. That means the Radeon AI Pro R9700 is rocking 4,096 SPs, 9% fewer than the Radeon Pro W7800, which correlates to it also having 9% fewer AI accelerators. In the Radeon AI Pro R9700’s defense, the CUs are RDNA 4, and the AI accelerators are second generation.


Regarding FP16 performance, the Radeon AI Pro R9700 peaks at 96 TFLOPS, 6% faster than the Radeon Pro W7800. AMD rates the graphics card at 1,531 TOPS of AI performance.
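
The percentage gaps quoted in the last two paragraphs fall straight out of the specification table above; here’s a quick sketch of that comparison:

```python
# Radeon AI Pro R9700 vs Radeon Pro W7800, using values from the spec table above.
r9700 = {"shaders": 4_096, "ai_cores": 128, "fp16_tflops": 96.0}
w7800 = {"shaders": 4_480, "ai_cores": 140, "fp16_tflops": 90.5}

def pct_diff(new: float, old: float) -> float:
    """Percentage difference of `new` relative to `old`."""
    return (new - old) / old * 100

print(f"Shaders:         {pct_diff(r9700['shaders'], w7800['shaders']):+.1f}%")           # ~-8.6% (about 9% fewer)
print(f"AI accelerators: {pct_diff(r9700['ai_cores'], w7800['ai_cores']):+.1f}%")          # ~-8.6%
print(f"FP16 TFLOPS:     {pct_diff(r9700['fp16_tflops'], w7800['fp16_tflops']):+.1f}%")    # ~+6.1% (about 6% faster)
```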

AMD claims the Radeon AI Pro R9700 offers 2X improved performance over the Radeon Pro W7800 in DeepSeek R1 Distill Llama 8B. For some strange reason, AMD compared the Radeon AI Pro R9700 to the GeForce RTX 5080. Tested in a few large AI models, the Radeon AI Pro R9700 delivered up to 5X higher performance than the RTX 5080.


The Radeon AI Pro R9700 is equipped with 32GB of GDDR6 memory. AMD has not disclosed the specifications regarding the speed of the memory chips or the width of the memory interface. Given that the Radeon Pro W7800 features 18 Gbps GDDR6, it is reasonable to conclude that the Radeon AI Pro R9700 should utilize memory chips with superior speed.

With 32GB of onboard memory, the Radeon AI Pro R9700 can tackle most AI models. It matches the capacity of the 32GB Radeon Pro W7800, but not the 48GB variant. The Radeon AI Pro R9700’s typical blower-type design will enable users to rock up to four of them inside a single system, such as AMD’s Ryzen Threadripper platform, which has good multi-GPU support. With four of them, users will have access to 128GB of VRAM, more than enough for heavy models that exceed 100GB of VRAM usage.


The Radeon AI Pro R9700 has a 300W TBP (Total Board Power). It’s 15% greater than the Radeon Pro W7800 32GB and 7% higher than the Radeon Pro W7800 48GB. Similar to most workstation-grade graphics cards, the Radeon AI Pro R9700 has the power connector at the rear. However, AMD has not indicated the type of power connector it employs, and it’s not visible in the provided renders. Considering the 300W rating, we would anticipate it to require two 8-pin PCIe power connectors. The Radeon AI Pro R9700 renders illustrate the graphics card featuring four DisplayPort outputs. Since it utilizes the RDNA 4 architecture, these outputs should conform to the 2.1a standard.

AMD has announced that the Radeon AI Pro R9700 will launch in July, but it has not revealed pricing details. In contrast, the Radeon Pro W7800 debuted at $2,499 two years ago and has maintained most of its value, currently priced at $2,399. We will soon learn the price of the Radeon AI Pro R9700 as its launch approaches in just a couple of months. AMD anticipates a healthy supply of the Radeon AI Pro R9700 from its partners, including ASRock, Asus, Gigabyte, PowerColor, Sapphire, XFX, and Yeston.




Product Reviews

Where to buy Nvidia’s RTX 5060 8GB GPU

by admin May 20, 2025



Following some very closely guarded previews, the new Nvidia RTX 5060 is now available to buy, delivering budget performance – and just 8GB of VRAM – starting at $299 (MSRP). It’s a 1080p card that promises decent framerates, but we’re yet to complete unfettered testing to determine where it ranks among the best GPUs and our overall GPU benchmarking hierarchy.

Unveiled in April, the 5060 follows the RTX 5060 Ti, which launched April 16 at prices of $379 and $429 for the 8GB and 16GB models, respectively. Like the Ti, the 5060 includes DLSS 4, including Multi Frame Generation and Super Resolution, as well as Nvidia Reflex. The drivers were released on May 19 alongside the card, which has precluded reviews going out ahead of release.

Over the weekend, select outlets published preview articles with strict criteria about which games could be tested and using which settings.

As you might imagine, these very favorable conditions yielded up to 25% performance boosts over Nvidia’s RTX 4060. Reportedly, Nvidia only sanctioned Avowed, Doom: The Dark Ages, Marvel Rivals, Cyberpunk 2077, and Hogwarts Legacy as review titles, with comparisons limited to the RTX 3060 and RTX 2060 Super, and with resolution fixed at 1080p, ultra image quality, DLSS in quality mode, and ray tracing enabled. Testing was also limited to running with frame generation enabled.

This has naturally yielded fairly positive results thus far for obvious reasons. From the available figures, the 5060 trails the Ti variant by around 15% on average when using 2x DLSS, but appears to show performance increases of up to 25% over the RTX 4060 running titles like Cyberpunk 2077.

Naturally, we’d recommend waiting for full reviews (including our own, which is on the way) before making the purchase. However, if you’d like to look at stock or have no qualms about taking the plunge, here’s where you can buy one.

Where to buy the Nvidia RTX 5060 in the US

Use our handy table to check what’s in stock and what models are available at which retailer. Check back daily, as this list is updated with the latest offers and pricing.

Click on a price to be taken directly to the retailer and model listed.

Model | Retailer | Price | Stock
Asus Dual GeForce RTX 5060 8GB | Newegg | $299 | Out of Stock
Asus Prime GeForce RTX 5060 8GB | Newegg | $299 | Out of Stock
Asus Prime GeForce RTX 5060 OC 8GB | Newegg | $379 | Out of Stock
Asus TUF Gaming GeForce RTX 5060 OC 8GB | Newegg | $409 | Out of Stock
Gigabyte Aero GeForce RTX 5060 OC 8GB | Newegg | $349 | In Stock
Gigabyte Aero GeForce RTX 5060 OC 8GB | B&H Photo | $349 | In Stock
Gigabyte Aorus Elite GeForce RTX 5060 8GB | Newegg | $359 | Out of Stock
Gigabyte Aorus Elite GeForce RTX 5060 8GB | B&H Photo | $359 | Out of Stock
Gigabyte Eagle GeForce RTX 5060 OC 8GB | Newegg | $329 | In Stock
Gigabyte Eagle GeForce RTX 5060 OC 8GB | B&H Photo | $329 | In Stock
Gigabyte Eagle Ice GeForce RTX 5060 Ti OC 8GB | Newegg | $329 | Out of Stock
Gigabyte Gaming GeForce RTX 5060 OC 8GB | Newegg | $339 | Out of Stock
Gigabyte Gaming GeForce RTX 5060 OC 8GB | B&H Photo | $339 | Out of Stock
Gigabyte Low Profile GeForce RTX 5060 8GB | Newegg | $339 | In Stock
Gigabyte Low Profile GeForce RTX 5060 8GB | B&H Photo | $339 | Out of Stock
Gigabyte Windforce GeForce RTX 5060 8GB | Newegg | $299 | Out of Stock
Gigabyte Windforce GeForce RTX 5060 8GB | B&H Photo | $299 | Out of Stock
Gigabyte Windforce GeForce RTX 5060 OC 8GB | Newegg | $319 | In Stock
Gigabyte Windforce GeForce RTX 5060 OC 8GB | B&H Photo | $319 | In Stock
MSI Gaming GeForce RTX 5060 OC 8GB | Newegg | $369 | Out of Stock
MSI Gaming GeForce RTX 5060 OC 8GB | B&H Photo | $369 | Out of Stock
MSI Gaming Trio GeForce RTX 5060 OC 8GB | Newegg | $379 | Out of Stock
MSI Gaming Trio GeForce RTX 5060 OC 8GB | B&H Photo | $379 | Out of Stock
MSI Gaming Trio White GeForce RTX 5060 OC 8GB | Newegg | $409 | Out of Stock
MSI Gaming Trio White GeForce RTX 5060 OC 8GB | B&H Photo | $409 | Out of Stock
MSI Inspire 2X OC GeForce RTX 5060 8GB | Newegg | $359 | Out of Stock
MSI Inspire 2X OC GeForce RTX 5060 8GB | B&H Photo | $359 | Out of Stock
MSI Shadow 2X OC GeForce RTX 5060 8GB | Newegg | $299 | In Stock
MSI Shadow 2X OC GeForce RTX 5060 8GB | B&H Photo | $299 | Out of Stock
MSI Ventus 2X OC GeForce RTX 5060 8GB | Newegg | $319 | Out of Stock
MSI Ventus 2X OC GeForce RTX 5060 8GB | B&H Photo | $319 | Out of Stock
MSI Ventus 2X OC White GeForce RTX 5060 8GB | Newegg | $329 | Out of Stock
MSI Ventus 2X OC White GeForce RTX 5060 8GB | B&H Photo | $329 | Out of Stock
MSI Ventus 3X OC GeForce RTX 5060 8GB | Newegg | $349 | Out of Stock
MSI Ventus 3X OC GeForce RTX 5060 8GB | B&H Photo | $349 | Out of Stock
PNY ARGB OC GeForce RTX 5060 8GB | Newegg | $349 | Out of Stock
PNY Dual Fan OC GeForce RTX 5060 8GB | Newegg | $299 | Out of Stock
PNY Dual Fan OC GeForce RTX 5060 8GB | B&H Photo | $299 | Out of Stock
Zotac Amp GeForce RTX 5060 8GB | Newegg | $319 | Out of Stock
Zotac Solo GeForce RTX 5060 8GB | Newegg | $299 | Out of Stock
Zotac Twin Edge GeForce RTX 5060 OC 8GB | Newegg | $309 | Out of Stock




Game Updates

Only press who previewed the RTX 5060 under Nvidia’s test conditions are getting review drivers, reports claim

by admin May 20, 2025


In classic me fashion, I swanned off for a few days just as another graphics card fracas has spilled out into public view. At the centre this time is the previously unassuming RTX 5060, which you may have noticed is due for launch today yet only has a handful of “hands-on previews” to tell you how big of a graphics it does. Allegedly, that’s because Nvidia have been keeping hold of the drivers needed for full reviews, only providing them at the eleventh hour to press outlets that have previously run these previews. No preview? No review, at least until the drivers release publicly later today, and what’s more, the same reports say that these previews were only offered under strict testing provisos set by Nvidia themselves.

According to VideoCardz and Hardware Unboxed, the mandated test conditions supposedly range from only allowing certain games for benchmarking – judging from the previews currently online, these were Doom: The Dark Ages, Avowed, Cyberpunk 2077, Hogwarts Legacy and Marvel Rivals – to the more egregious demand that RTX 5060 performance figures would focus on DLSS 4’s Multi Frame Generation (MFG). And, in turn, would only be compared to results from older XX60 GPUs that lack DLSS frame gen support entirely.

“We worked with a few chosen media on previews with a pre-release driver,” an Nvidia spokesperson told me this afternoon. No comment on the review driver situation, other than a 5pm BST release time, was given.

Image credit: Rock Paper Shotgun

RPS was not invited to take part in these previews, and I can’t imagine agreeing to such terms if we were. Although it doesn’t appear that Nvidia required previewers to give positive RTX 5060 takes, with several highlighting the shortcomings of its 8GB VRAM limit, the limited game selection and emphasis on frame-genned performance versus the much older RTX 3060 and RTX 2060 Super are clearly intended to push a particular narrative: one that at best downplays the drawbacks of frame generation and at worst misleads readers with an unhelpfully narrow view of relative performance. GameStar, a German site that took Nvidia up on the offer, said in their preview that the GPU giant even specified the in-game settings that each game should be tested with.

The sense that a big, green thumb is pressing down on the critical scales is deepened by the alleged trading of earlier review drivers for a compliant preview. Even if, by that point, reviewers are free to use their own, independently-set benchmarks, the initial wave of RTX 5060 reviews will come from publications that Nvidia has – accurately or otherwise – deemed more friendly than others. Those who refused the locked-down previews, and have thus demonstrated less of a willingness to go along with the desired messaging, will be forced to wait before sharing impressions.

I can’t claim absolute moral superiority here because again, I wasn’t invited, and thus didn’t have the chance to send a “Thanks but no thanks” email (even if I hadn’t simultaneously been too busy recovering from gin-assisted groomsman duty). Still, yeah, not a fan.

I have recently noticed Nvidia PRs becoming unusually pushy about how great it would be to test such and such frame generation in such and such game, but functionally those have only ever been suggestions, and I’ve never faced even a veiled hint at retribution for ignoring them in my reviews. Nonetheless, I now find myself in the bizarre position of having had physical possession of an RTX 5060 for nearly a week (posted by Zotac, with no strings attached other than to please not lose or break it), yet don’t have the software means to test or appraise it on the day of release. Like, man, at least Bethesda didn’t send us copies of Starfield while they were withholding the activation keys.

Watch on YouTube

More disturbing still is that this isn’t even the only accusation of editorial manhandling to be laid at Nvidia’s feet today. Big-deal tech YooToobers Gamers Nexus claimed in a video (above) that Nvidia have, with varying levels of subtlety, threatened to cut off their interview access to Nvidia engineering staff in response to a perceived lack of focus on DLSS and MFG performance testing in their reviews. Gamers Nexus have, in fact, produced multiple long-form vids on these topics specifically.

It isn’t unheard of for, nor technically outside the rights of, companies to pick and choose who gets primo access for coverage. In tech media especially, there may even be a minor, ethically unbothersome quid-pro-quo involved: attending a virtual briefing, for instance, in exchange for getting onto the review list. But there’s a honking great difference between asking journalists to sit through a thirty-minute slideshow and, essentially, demanding editorial jurisdiction over how their products are evaluated. Nvidia, one of the richest, most powerful firms on Earth, should know better – and should have at least had an idea that being caught fiddling with the independent review process might cause more damage to the RTX 5060 than a few variations of “It’s not much faster than the 4060, is it?”


