Laughing Hyena
Tag: Nvidia

GameFi Guides

Nvidia, Tesla Stocks on Tron: xStocks Expand After Hitting Trading Milestone

by admin August 20, 2025



In brief

  • xStocks, a feature that allows users to trade tokenized versions of stocks in companies like Nvidia, Apple, and Meta, is expanding to Tron.
  • The tokenized stocks are backed 1:1 with real shares in the underlying companies.
  • xStocks have so far generated more than $500 million in on-chain trading volume.

Tokenized versions of Tesla, Nvidia, Apple, and other major stocks are now available for trading on the Tron network thanks to a collaboration between the Tron DAO, Kraken, and Backed—the firm behind xStocks, which offers global investors access to tokenized equities. 

The feature’s expansion comes shortly after xStocks eclipsed more than $500 million in on-chain trading volume to date, according to data from its official Dune dashboard. 

“Expanding xStocks to Tron, a network that settles over $20 billion daily, will significantly boost brand visibility and adoption,” Kraken Global Head of Consumer Mark Greenberg told Decrypt. “More importantly, it brings us closer to a fully permissionless, borderless, and interoperable market where anyone can trade tokenized equities around the clock.”



Backed’s xStocks feature allows users to gain exposure to American equities via on-chain tokens that are 1:1 backed with actual shares in each respective company. Previously available to users on Solana and BNB Chain, and offered via centralized exchanges like Kraken, these tokens will now be available via TRC-20 tokens on Tron. 

“This collaboration highlights how Tron’s decentralized network can bring tokenized equities into a more open, transparent, and accessible environment,” said Tron founder Justin Sun, in a statement. 

“Tokenized equities represent a natural evolution for crypto, bridging traditional markets with blockchain,” he added. “As demand for popular equities meets a global base of previously excluded users, we’ll see a more efficient, flexible, and accessible market.”

Tron is arguably best known for its substantial stablecoin usage, which has grown by about $23 billion in the last year, according to data from DefiLlama. The network holds $82.8 billion in stablecoins—primarily Tether’s USDT—trailing only Ethereum, which maintains $143 billion in stablecoins at the time of writing.

xStocks have generated more than $2.9 billion in volume between centralized exchanges and their decentralized counterparts, though the split falls heavily in favor of centralized exchanges, which have accounted for more than 95% of the volume, according to the official xStocks Dune dashboard. Over $500 million of that trading has taken place on-chain, per the dashboard.

The tokenized equities currently account for around $46.4 million in assets under management, more than 20% of which can be attributed to tokenized shares in Elon Musk’s Tesla (TSLA). 

As for bigger goals, Backed is focused on continuing to expand its xStocks product offering.

“Our priority is expanding the xStocks Alliance—bringing more partners, blockchains, and applications onboard as we deliver the gold standard in tokenized equities offerings,” Greenberg said. “That’s where the real growth happens.”

More features, chain integrations, and new offerings are expected in the coming weeks, he added. 




Gaming Gear

‘Play Instantly on Discord’: Fortnite will be Nvidia and Discord’s first instant game demo

by admin August 18, 2025


Nvidia’s GeForce Now is getting a big upgrade next month — and it’s also part of an intriguing new experiment. Nvidia, Discord, and Epic Games have teamed up for an early test of instant game demos for Discord servers, which could theoretically let you immediately try a game without buying it, downloading it, or signing up for an account.

“You can simply click a button that says ‘try a game’ and then connect your Epic Games account and immediately jump in and join the action, and you’ll be playing Fortnite in seconds without any downloads or installs,” says Nvidia product marketing director Andrew Fear.

A screenshot from an Nvidia video shows what it might look like, and also shows that the Fortnite demo is currently limited to a 30-minute free trial.

It doesn’t sound completely frictionless if you still need an Epic Games account to play, and it’s not clear if Nvidia, Epic and Discord will offer the demo outside of Gamescom just yet. Nvidia is calling it a “technology announcement” rather than a confirmed feature, one that’ll hopefully see game publishers and developers reach out if they’re interested in potentially adding it to their games.

After Sony bought Gaikai in 2012, it initially suggested it would offer instant try-before-you-buy game demos on the PlayStation 4 too, but that never happened. Years later, Gaikai’s founder told me that publishers didn’t necessarily want it.



SkyDefense CobraJet C-UAS fighter drone and interceptor
Product Reviews

CobraJet Nvidia AI-powered drone killer takes out ‘overwhelming enemy drone incursions’ at up to 300mph

by admin August 18, 2025



Defense startup SkyDefense LLC just launched an autonomous combat drone designed to take out enemy drone swarms at a much lower cost than traditional weapon systems. The company calls it the CobraJet — an uncrewed aerial vehicle (basically, a drone) designed for C-UAS (counter-unmanned aircraft system) missions. The drone pairs Teledyne FLIR electro-optical and infrared sensors, which contain no restricted foreign parts, with Nvidia AI chips that let it process what its onboard sensors see.

A different kind of VRAM

The CobraJet is also equipped with its proprietary Visual Realtime Area Monitoring (VRAM) system, allowing ground commanders to monitor the drone during autonomous operations and communicate with and control it, if needed. This gives its operator the option to let it operate on its own during reconnaissance, patrol, and identification, but still have a human making decisions when required. It can also use the same technology to communicate with other CobraJet units, allowing them to act together as a single entity to protect against enemy swarms.

Aside from its AI brain, the CobraJet also boasts an internal weapons bay and external hardpoints, allowing it to carry kamikaze drones, small missiles, or even fragmentation projectiles. It can also be modified to carry precision bombs and loitering munitions, making it a multirole drone. Its external design mimics that of the U.S.’s latest air superiority and multirole fighters, the F-22 Raptor and F-35 Lightning II, with vertical take-off and landing capabilities and thrust vectoring nozzles. This means it can operate from the back of a truck and have improved maneuverability, allowing it to go toe-to-toe with small and nimble drones.



Asymmetric warfare answered?

CobraJet is SkyDefense’s solution to the emerging threat of drone swarms on the modern battlefield. These small and cheap weapons are widely used in Russia’s invasion of Ukraine, with the defending Ukrainians effectively using drones to initially counter the larger Russian army. Today, both sides in the conflict use UAVs, and actions on the battlefield highlight the U.S.’s need to develop a cost-effective counter.

While existing weapons like surface-to-air missiles and air-to-air missiles can engage drones, there’s often a huge price mismatch between the two platforms. Missiles often cost anywhere from half a million to more than $4 million, while a cheap drone can be bought for just $200, and even more sophisticated ones, like Iran’s Shahed-136, cost only about $20,000. You can also send up a platoon of combat choppers to engage a drone swarm with guns, but you’re risking several multi-million-dollar weapon platforms to combat cheap suicide drones.

(Image credit: SkyDefense LLC)

“Our USA-made CobraJets can communicate and coordinate as a flight team, enabling them to operate as an AI-powered unmanned Air Force,” said SkyDefense LLC President Nick Verini. “This team approach increases the effectiveness of the squadron while also significantly reducing the costs of destroying a swarm of enemy drones.”

SkyDefense LLC hasn’t released the unit cost of the CobraJet, but it should be far more affordable than the fighter jets it resembles and the missiles they carry. The company is offering the drone to law enforcement, Homeland Security, and the U.S. military, giving them the ability to protect against hostile drone swarms without needing to spend copious amounts of money to take down such cheap weapons.




Game Updates

There’s a desktop-grade Nvidia RTX 5050 after all, and it’s out next month

by admin June 25, 2025


Yesterday’s launch of the Nvidia GeForce RTX 5050 laptop GPU was quiet, yet nonetheless featured a surprise: there’s also a desktop version of the RTX 5050, set to release (exclusively in third-party garb – there’s no Founders Edition) in “the second half” of July. Quite the turnaround for the non-laptop XX50 lineage, which was widely assumed extinguished following 2022’s RTX 3050.

The RTX 5050 will replace the RTX 5060 as the cheapest of Nvidia’s current-gen desktop GPUs, starting at $249. Budget buyers will need to make do with 8GB of last-gen GDDR6 VRAM, however, as well as a lighter smattering of CUDA cores: 2,560 to the RTX 5060’s 3,840. There’s obviously no desktop RTX 4050 to compare these specs to, though next to the RTX 3050, it does offer more memory bandwidth even if the total gigabyte count remains the same. Presumably there are also performance advantages, especially where ray tracing is concerned, that the RTX 5050 can glean from its updated Blackwell architecture.

‘Course, while the RTX 5050 might make for a decent upgrade for current 3050 owners – especially those interested in DLSS 4 and frame generation – its $249 price tag does keep it nervily close to the $300 RTX 5060. Which, even with its driver issues, really isn’t a bad 1080p card itself, so we’ll have to see what the 5050 can do to coexist with its more core-rich, wider-bandwidth big brother.

I’ll admit, there’s a not-small part of me that hopes it can. We can raise eyebrows at the specifications and make “Fnar fnar new GTX 1030” jokes until our typing fingers are worn down to stumps, but there genuinely are vast numbers of PC players who are happy with below-max quality 1080p yet can’t reliably stretch to mid-rangers like the XX60 series. People, in other words, who haven’t been adequately served by the GPU market in years. Arguably since even before the RTX 3050, what with that GPU launching during the dark times of COVID-compounded shortages and price gouging.

Will the RTX 5050 fill that gap? I won’t know until I’ve tested one out. But I’d rather it ends up as a failed attempt, than a repeat of the RTX 40 series’ failing to try at all.



Product Reviews

$200 GPU face-off: Nvidia RTX 3050, AMD RX 6600, and Intel Arc A750 duke it out at the bottom of the barrel

by admin June 21, 2025



It’s a tough time to be a gamer on a tight budget. The AI boom has made fab time a precious resource. There’s no business case for GPU vendors to use their precious TSMC wafers to churn out low-cost, low-margin, entry-level chips, much as we might want them to.

The ever-shifting tariff situation in the USA means prices are constantly in flux. And ever-increasing VRAM requirements mean that the 4GB and 6GB graphics cards of yore are being crushed by the latest games. Even if you can still find those cards on shelves, they’re not smart buys.

So what’s the least a PC gamer can spend on a new graphics card these days and get a good gaming experience? We tried to find out.



We drew a hard line at cards with 8GB of VRAM. Recent graphics card launches have shown that 8GB is the absolute minimum for gamers who want to run modern titles at a 1080p resolution.

PC builders in this bracket aren’t going to be turning on Ray Tracing Overdrive mode in Cyberpunk 2077, or RT more generally, which is where VRAM frequently starts to become a true limit. Even raster games can challenge 8GB cards at 1080p with all settings maxed, though.

We also limited our search to modern cards that support DirectX 12 Ultimate. You might find a cheap GPU out there with 8GB of VRAM, but if it doesn’t support DirectX 12 Ultimate, it’s truly ancient.

Within those constraints, we found three potentially appealing options, all around the $200 mark. The Radeon RX 6600 is available for just $219.99 at Newegg right now in the form of ASRock’s Challenger D model. Intel’s Arc A750 can be had for $199.99, also courtesy of ASRock. Finally, the GeForce RTX 3050 8GB is still hanging around at $221 thanks to MSI’s Ventus 2X XS card. We pitted this group against each other to find out whether any of them are still worth buying.


Raster gaming performance

We whipped up a quick grouping of a few of today’s most popular and most advanced games at 1080p and high or ultra settings without upscaling enabled, along with a couple older titles, to get a sense of how these cards still perform. We also did 1440p tests across a mix of medium and high settings (plus upscaling on Alan Wake 2) to see how these cards handled a heavier load.

(Image gallery: 20 benchmark charts. Image credit: Tom’s Hardware)

The Arc A750 consistently leads in our geomean of average FPS results at 1080p. It’s 6% faster than the RX 6600 overall and 22% faster than the RTX 3050. At 1440p, the A750 leads the RX 6600 by 18% and the RTX 3050 by 25%.

The Arc A750 also leads the pack in the geomean of our 99th-percentile FPS results. It delivered the smoothest gaming experience across both resolutions.
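As a sanity check on how composites like these are built, here is a minimal sketch of the geometric mean ("geomean") used to aggregate per-game FPS. The sample numbers are purely illustrative and are not our test data.

```python
from math import prod

def geomean(values):
    # Geometric mean: the nth root of the product of n values.
    # Preferred over the arithmetic mean for FPS composites because
    # it keeps one outlier game from dominating the aggregate.
    return prod(values) ** (1 / len(values))

# Illustrative per-game average FPS (not our measured results)
fps_per_game = [62.0, 48.5, 71.2, 55.9]
print(round(geomean(fps_per_game), 1))
```

Because it multiplies rather than sums, a card that doubles its frame rate in one game and halves it in another ends up exactly where it started, which is the behavior you want from a cross-game average.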

Some notes from our testing: Alan Wake 2 crushes all of these cards, and you’re going to want some kind of upscaling to make it playable. Given the option, we’d also turn Nanite and Lumen off in any Unreal Engine 5 title that supports them, as they either tank performance (in the case of the RTX 3050 and A750) or introduce massive graphical errors (as seen on the Radeon RX 6600 in Fortnite).

There’s supposed to be ground there… (Image credit: Tom’s Hardware)

The major Fortnite graphics corruptions we saw on the RX 6600 have been reported for months across multiple driver versions on all graphics cards using Navi 23 GPUs, not just on the RX 6600, and it’s not clear why AMD or Epic hasn’t fixed them. The RX 6600 is also the single most popular Radeon graphics card in the Steam hardware survey, so we’re surprised this issue is still around. We’ve brought it up with AMD and will update this article if we hear back.

⭐ Winner: Intel

Ray tracing performance

Let’s be blunt: don’t expect a $200 graphics card to deliver acceptable RT performance. 8GB of VRAM isn’t enough to enable the spiffiest RT effects in today’s titles; the visual payoff usually isn’t worth the performance hit, and enabling upscaling at 1080p generally compromises visual quality, even as it claws back some of that lost performance. It’s better to put other priorities first (or to save up for a more modern, more powerful graphics card).

(Image gallery: 3 benchmark charts. Image credit: Tom’s Hardware)

Even with those cautions in mind, we were surprised to see that the Arc A750 can still deliver a reasonably solid experience with RT on in older titles. Doom Eternal still runs at high frame rates with its sole RT toggle flipped on, and Cyberpunk 2077 offers a solid enough foundation for enabling XeSS at 1080p and medium RT settings if you’re hell-bent on tracing rays.

Black Myth Wukong overwhelms the A750 even with low ray tracing settings and XeSS Balanced enabled, though, so performance tanks. XeSS also introduces plenty of intrusive visual artifacts that make it unusable in this benchmark, and the game’s FSR implementation is no better. It’s modern RT titles like this where 8GB cards like the A750 are most likely to end up struggling.

The RTX 3050 does OK with the relatively light RT load of Doom Eternal, but it can’t handle Cyberpunk 2077 well enough to create a good foundation for upscaling, and Black Myth Wukong is also out of the question.

The RX 6600 has the least advanced and least numerous RT accelerators of the bunch, so its performance lands it way at the back of the pack.

⭐ Winner: Intel

Upscaling

The RTX 3050 is the only card among these three that can use Nvidia’s best-in-class DLSS upscaler, which recently got even better in some games thanks to the DLSS 4 upgrade and its transformer-powered AI model. DLSS is an awesome technology in general, and Nvidia claims that over 800 games support it; however, the performance boost it offers on the RTX 3050 isn’t particularly great. This is not that powerful a GPU to begin with, and multiplying a low frame rate by a scaling factor just results in a slightly less low frame rate.

Four years after its introduction, some version of AMD’s FSR is available in over 500 games, and it can be enabled on virtually every GPU. That ubiquity is good news for the RX 6600 (and everybody else), but there’s a catch: FSR’s delivered image quality so far has tended to be worse than DLSS and XeSS. The image quality gap appears set to close with FSR 4, but the Radeon RX 6600 won’t get access to that tech. It’s reserved for RX 9000-series cards only.

Intel’s XeSS upscaler can be enabled on graphics cards from any vendor if a game supports it, although the best version of the XeSS model only runs on Arc cards. XeSS is available in over 200 titles, so even though it’s not as broadly adopted as DLSS or FSR, it’s fairly likely you’ll find it as an option. We’d prefer it over FSR on an Arc card where it’s available, and you should try it on Radeons to see if the results are better than AMD’s own tech.

⭐ Winner: Nvidia (generally), AMD (in this specific context)


Frame generation

The RTX 3050 doesn’t support DLSS Frame Generation at all. If you want to try framegen on this card, you’ll have to rely on cross-vendor approaches like AMD’s FSR 3 Frame Generation.

Intel’s Xe Frame Generation comes as part of the XeSS 2 feature set, and those features are only baked into 22 games so far. Unless one of your favorite titles already has XeSS 2 support, it’s unlikely that you’ll be able to turn on Intel’s framegen tech on your Arc card. As with the RTX 3050, your best shot at trying framegen comes from AMD’s FSR 3.

AMD’s FSR Frame Generation tech comes as part of the FSR 3 feature set, which has been implemented in 140 games so far. As we’ve noted, FSR 3 framegen is vendor-independent, so you can enable it on any graphics card, not just the RX 6600.

AMD’s more basic Fluid Motion Frames technology also works on the RX 6600, but only in games that offer an exclusive fullscreen mode. Since Fluid Motion Frames is implemented at the driver level, it lacks access to the motion vector information that FSR 3 Frame Generation gets. FMF should be viewed as a last resort.

⭐ Winner: AMD

Power

The RTX 3050 is rated for 115W of board power, but it doesn’t deliver particularly high performance to go with that rating. It’s just a low-power, low-performance card.

The Radeon RX 6600 delivers the best performance per watt in this group with its 132 W board power. It needs 15% more power than the RTX 3050 to deliver about 14% more performance at 1080p.

Intel’s Arc A750 needs a whopping 225 W to deliver its strong gaming performance, or nearly 100 W more than the RX 6600. That’s 70% more power for just 6% higher performance at 1080p, on average. Worse, Intel’s card also draws much more power at idle than either the RX 6600 or the RTX 3050 unless you tweak BIOS and Windows settings to mitigate that behavior.
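The efficiency gap is straightforward arithmetic; a quick sketch using the board-power and performance figures quoted above:

```python
# Board-power figures from the text
a750_watts = 225
rx6600_watts = 132

extra_power = a750_watts / rx6600_watts - 1  # fraction of additional power
perf_gain_1080p = 0.06                       # A750's average 1080p lead over the RX 6600

print(f"{extra_power:.0%} more power for {perf_gain_1080p:.0%} higher performance")
# prints "70% more power for 6% higher performance"
```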

⭐ Winner: AMD

Drivers and software

Nvidia’s Game Ready drivers reliably appear alongside the latest game releases, and Nvidia has a history of quickly deploying hotfixes to address specific show-stopping issues. Users have reported that Nvidia’s drivers have been getting a little shaky alongside the release of RTX 50-series cards, though, and we’ve seen evidence of that same instability in our own game testing.

Games aren’t the only place where drivers matter. Nvidia’s massive financial advantage over the competition means that non-gamers who still need GPU acceleration, like those using Adobe or other creative apps, can generally trust that their GeForce card will offer a stable experience with that software.

The Nvidia App (formerly GeForce Experience) includes tons of handy features, like one-click settings optimization and game recording tools. Nvidia also provides useful tools like Broadcast for GeForce RTX owners free of charge. We don’t think you should pick the RTX 3050 for gaming on the basis of Nvidia’s drivers or software alone, though.

Intel has kept up a regular pace of new driver releases with support for the latest games, although more general app support may be a question mark. Intel Graphics Software has a slick enough UI and an everything-you-need, nothing-you-don’t feature set for overclocking and image quality settings. We wouldn’t choose an Arc card on the basis of Intel’s software support alone, but the company has proven its commitment to maintaining its software alongside its hardware.

AMD releases new graphics drivers on a monthly cadence, but some big issues may be getting through QA for older products like the RX 6600. Even in the limited testing we did for this face-off, we saw show-stopping rendering bugs in the latest version of Fortnite with Nanite virtualized geometry enabled. Users have been complaining of this issue for months, and it seems widespread enough that someone should have noticed by now.

The AMD Software management app boasts a mature, slick interface and useful settings overlay, along with plenty of accumulated features like Radeon Chill that some enthusiasts might find handy.

⭐ Winner: Nvidia

Accelerated video codecs

You probably don’t need a $200 discrete GPU for video encoding alone. If you already have a modern Intel CPU with an integrated graphics processor, you can already get high-quality accelerated video encoding and decoding without buying a discrete GPU.

That said, if you don’t have an Intel CPU with integrated graphics and you must have a high-quality accelerated video codec for transcoding, the RTX 3050 could be worth it as a light-duty option. If NVENC is all you want or need, though, the even cheaper (and less powerful) RTX 3050 6GB can be had for a few bucks less.

The Arc A750’s video engine supports every modern codec we’d want, and it offers high quality and performance. The high power requirements of the A750 (even at idle and under light load) make it unappealing for use in something like a Plex box, though. If accelerated media processing is all you need, you can still pick up an Arc A380 for $140.

The less modern accelerated video codec on the Radeon RX 6600 (and in Ryzen IGPs) produces noticeably worse results than those of Nvidia or Intel. It works fine in a pinch, but you will notice the lower-quality output versus the competition. If you’re particular about your codecs, look elsewhere.

⭐ Winner: Two-way tie (Intel and Nvidia)

Virtual reality

While VR hasn’t changed the world as its boosters once promised it would, the enduring popularity of apps like Beat Saber and VRChat means that we should at least give it a cursory look here.

The RTX 3050 and Radeon RX 6600 technically support basic VR experiences just fine, although you may find their limited power requires enabling performance-boosting tech like timewarp and spacewarp to get a comfortable experience.

Intel doesn’t support VR HMDs on the Arc A750 (or any Arc card at all, for that matter), so it’s a total no-go if you want to experience VR on your PC.

⭐ Winner: Two-way tie (AMD and Nvidia)

Bottom line

Category               AMD RX 6600    Nvidia RTX 3050 8GB    Intel Arc A750
Raster Performance     –              –                      ⭐
Ray Tracing            –              –                      ⭐
Upscaling              ⭐             –                      –
Frame Generation       ⭐             –                      –
Power                  ⭐             –                      –
Drivers                –              ⭐                     –
Accelerated Codecs     –              ⭐                     ⭐
Virtual Reality        ⭐             ⭐                     –
Total                  4              3                      3

Let’s be frank: it’s a rough time to be buying a “cheap” graphics card for gaming. To even touch a modern GPU architecture, you need to spend around $300 or more. $200 is the bottom of the barrel.

8GB of VRAM is a compromise these days, but our experience shows that you can get by with it at 1080p if you’re willing to tune settings. It isn’t reasonable to slam every slider to ultra and expect a good experience here. Relax some settings, enable upscaling when you need it, and you can still have a fun time at 1080p with just two Franklins in your wallet.

So who’s our winner? Not the GeForce RTX 3050. This card trails both the Radeon RX 6600 and Arc A750 across the board. You can’t enable DLSS Frame Generation on the RTX 3050 at all, and we’re not sure that getting access to the image quality of GeForce-exclusive DLSS 4 upscaling is worth dealing with this card’s low baseline performance. Unless you absolutely need a specific feature or capability this card offers, skip it.

Even four years after its launch, the Radeon RX 6600 is still solid enough for 1080p gaming. It trailed the Arc A750 by about 6% on average at 1080p (and about 15% at 1440p).

If it weren’t for this performance gap, the RX 6600’s strong showing in other categories would make it our overall winner. But not every win carries the same weight, and performance matters most of all when discussing which graphics card is worth your money.

That said, the RX 6600’s performance per watt still stands out. It needs 90 W less power than the A750 to do its thing, and it’s well-behaved at idle, even with a 4K monitor. If you have an aging or weak PSU, the RX 6600 might be your upgrade ticket.

AMD’s widely adopted and broadly compatible FSR upscaling and frame generation features help the RX 6600’s case, but they also work on the RTX 3050 and A750, so it’s kind of a push. The only real downsides to the RX 6600 are its dated media engine and poor RT performance. We also saw troubling graphical glitches in titles as prominent as Fortnite on this card that we didn’t experience on the Intel or Nvidia competition.

That leaves us with the Arc A750. This card delivers the most raw gaming muscle you can get for $200 at both 1080p and 1440p, but it comes with so many “buts.” Its high power requirements might make gamers with older or lower-end PSUs think twice. Intel’s graphics driver can be more demanding on the CPU than the competition, meaning older systems might not be able to realize this card’s full potential. And older systems that don’t support Resizable BAR won’t work well with the A750 at all.

Our experience shows that the A750 can stumble with Unreal Engine 5’s Lumen and Nanite tech enabled, and not every game exposes them as a simple toggle like Fortnite does. More and more studios are using UE5 as the foundation for their games, so there’s a chance this card could underperform in future titles in spite of its still-strong potential.

If you can’t spend a dollar more than $200 and you don’t mind risking the occasional performance pitfall in exchange for absolute bang-for-the-buck, the Arc A750 is still worth a look. If you want a more mature, well-rounded card, the Radeon RX 6600 is also a good choice for just a few dollars more. But if you have the luxury of saving up enough to get even an RTX 5060 at $300, we’d think long and hard about spending good money to get an aging graphics card.

Bottom line: None of these cards could be described as outright winners. Intel, AMD, and Nvidia all have plenty of opportunity to introduce updated GPUs with modern architectures in this price range, but there are no firm signs that any of them plan to (at least on the desktop). Until that happens, PC gamers on strict budgets will have to pick through older GPUs like these on the discount rack when buying new, or hold out for a used card with all its attendant risks.



A CPU cooler mounted on an Nvidia GTX 960
Product Reviews

Crazed modder straps CPU cooler to Nvidia GTX 960 with a 3D-printed bracket, breaks 3DMark benchmark record

by admin June 4, 2025



A YouTube modder and Redditor has successfully attached a CPU cooler to an Nvidia GTX 960 using a 3D-printed bracket, bringing temperatures down by 13 degrees compared to the stock cooler and breaking a 3DMark Fire Strike benchmark record in the process.

The daring tinkerer, whose modest YouTube channel trades under the name TrashBench, took to the platform to reveal how they used a CPU cooler, some zip ties, and eventually a 3D-printed bracket to create one of the most novel yet surprisingly effective GPU cooling solutions we’ve ever seen.

“Had a spare CPU cooler and figured I’d chuck it on my 960 for a laugh,” they revealed on Reddit. The first solution? Remove the GTX 960’s stock cooler and simply strap the CPU cooler to the card using some zip ties. The video reveals a precariously poised Cooler Master heatsink bound by some luminous yellow cable ties, a hilariously rudimentary first attempt.



I tried zip-tying a CPU cooler to my GTX 960. It got hotter. So I made a mount. Now it’s colder than stock. from r/hardware

“It looked dumb, ran hot, and nearly rattled itself apart,” TrashBench reveals. A second run of 3D Mark’s Fire Strike test yielded temperature increases of more than 10 degrees, likely caused by poor contact with the GPU’s heatsink.

Undeterred, TrashBench fired up the 3D printer and “whipped up a proper bracket.” The third run with the 3D-printed mount finally showcased the awesome power of the CPU cooler when properly in contact with the GPU, delivering a 13-degree improvement over the GTX 960’s stock cooler and a 20-degree improvement on the cable tie attempt.

What’s more, the monstrous combination even broke the 3DMark Fire Strike record for the GTX 960 and Intel Core i5-12600KF pairing, with a new top overall score of 7642, beating out the previous record of 7458. The CPU cooler setup also runs quieter than the GTX 960’s stock cooler, though TrashBench puts this down to the “trash” fan running at low speed.
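For reference, the margin on that record works out as follows from the two scores reported above:

```python
new_record = 7642
previous_record = 7458

margin = new_record - previous_record  # points gained over the old record
pct = margin / previous_record         # relative improvement

print(f"+{margin} points ({pct:.1%})")
# prints "+184 points (2.5%)"
```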


While the hardware combination is something of an eyesore, the physics behind the performance makes perfect sense. GPU coolers, by nature, have a much lower profile than CPU air coolers, whose finned towers can measure upwards of 15 centimeters in height. TrashBench confirmed the CPU cooler in play weighed 470 grams, compared to the 270-gram factory cooler taken from the GTX 960: a battle of physics with only one winner when it comes to shifting heat.


And TrashBench isn’t done. “I think I’ll have to try it on a 4080 next,” they quipped at the end of the video. Engaging further on Reddit, TrashBench also agreed that a GPU with a more sophisticated cooling system (the 960 is from 2015, after all) might not yield such a drastic improvement, promising to try a more potent cooler on a 2070 Super. In fact, TrashBench is even considering more thorough benchmarking that pits stock GPU coolers directly against similar-sized CPU coolers for fairer testing, a project that might even give our GPU Benchmarks Hierarchy a run for its money!

The full two-minute video is available on TrashBench’s YouTube channel.

Product Reviews

Nvidia CEO says Trump’s tariff plan is ‘utterly visionary’

by admin May 30, 2025



Nvidia CEO Jensen Huang made himself available for media interviews after the company published its record $44 billion quarterly results and held its traditional analysts’ call. Bloomberg has shared its on-air Q&A session with Huang, in which the Nvidia boss was questioned about the impact of U.S. policy on his company’s recent revenue. However, when specifically addressing President Trump’s tariffs and the decision to rescind the AI Diffusion Rule, Huang couldn’t have been more extravagant with his personal praise.

After Huang’s gentle criticism of U.S. policy regarding exports of technologies that could accelerate AI, the Bloomberg journalist got more specific about the administration’s policies, asking the Nvidia CEO about his trust in President Trump and the direction things were going.

“Obviously, I don’t know all of his ideas, but let me tell you about two that are incredible,” answered Huang. “The first one is utterly visionary. The idea of tariffs being a pillar of a bold vision to re-industrialize, to onshore manufacturing, and to motivate the world to invest in the United States is just an incredible vision. I think this is going to be a transformative idea for the next century for us,” explained the Nvidia CEO. “We’re all in on the idea. We’re setting up plants and encouraging our partners from around the world to invest in the United States, and we have a lot of stuff going on, and so I’m very excited about that.”



Some more outpourings of praise followed. Huang was perhaps hoping to influence any forthcoming decision(s) that would fill the vacuum caused by the AI Diffusion Rule being rescinded. “The second major idea is to rescind the AI diffusion rule, recognizing that this isn’t about limiting American technology, but this is about accelerating American stacks around the world to make sure that, before it’s too late, the world builds on American stacks during this extraordinary time, the AI era.”

Yet more praise was lavished on the U.S. president to underline Huang’s admiration. He ended this segment by saying, “These two initiatives are completely visionary, and it’s going to be transformative for America.”

The full interview, “Nvidia CEO Jensen Huang Interview | Bloomberg Technology Special,” is available on YouTube.

The interview with Huang also covered how Nvidia successfully made up for lost China revenue streams. The Nvidia CEO snappily replied, “We have a whole bunch of engines firing right now,” illustrating the appeal of a broad base and wide customer portfolio. He also took the opportunity to blow the Nvidia tech trumpet, adding, “people realize that Blackwell is a home run.”

Still on the topic of China, Huang lamented the loss of U.S. influence in the AI industry there. He reminded the interviewer that the Chinese market is very important for its absolute size, and that it is still home to maybe 50% of the world’s AI researchers. Naturally, developers there are pivoting to Huawei, for example. That’s an “unfortunate part of changing policy,” said Huang, but he hoped things would change so U.S. tech could again become a desirable standard.



Later in the interview, the Nvidia CEO was asked about immigration and the tech industry in the U.S. Naturally, Huang was all for streamlining the inflow of talented engineers and scientists. He also took the opportunity to heap praise on Elon Musk. The Nvidia CEO described Musk as an “extraordinary engineer” who was stewarding “revolutionary companies.”

The Bloomberg interview ended with some talk about Europe. Huang will be seeing many heads of state and companies across Europe in the coming week. AI is going to be part of national infrastructure, like electricity or the internet, and Europe wants to embrace this idea, it seems.

Crypto Trends

Nvidia Q1 Revenue Beats, Earnings Miss Due to China Export Curbs

by admin May 29, 2025



The US tech giant Nvidia, which makes advanced artificial intelligence (AI) computer chips, has shared its financial results for the first three months of its 2026 fiscal year, which ended on April 27, 2025. The company made $44.1 billion in sales, a 12% increase from the previous three months and a 69% increase from the same period last year.

The revenue beat the $42.91 billion that experts at Zacks predicted by about 2.7%. However, Nvidia’s profit per share was $0.81, slightly below the $0.85 that the experts expected. The firm’s total profit was $18.8 billion, 26% more than the previous year.
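As a quick sanity check on the figures above, a few lines of Python reproduce the size of the revenue beat. This sketch uses Nvidia's exact reported revenue of $44.062 billion (the $44.1 billion above is a rounded figure):

```python
# All figures in billions of USD, as reported for Q1 FY2026.
revenue = 44.062          # actual reported revenue
zacks_estimate = 42.91    # Zacks consensus estimate

# Percentage by which actual revenue exceeded the estimate
beat_pct = (revenue - zacks_estimate) / zacks_estimate * 100
print(f"Revenue beat the consensus by {beat_pct:.1f}%")  # ~2.7%
```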

The company’s profits were lower than its expectations because it had to set aside $4.5 billion due to U.S. government rules that blocked the sale of the firm’s powerful H20 AI chips to China. Nvidia’s CEO, Jensen Huang, said that the demand for their AI technology is very strong worldwide, comparing it to essential services like electricity and the internet, which everyone needs.

The firm expects to earn about $45 billion in sales for the next three months, but it predicts losing $8 billion of that because the U.S. government has restricted sales of its high-powered H20 AI chips to China. To work around this, Nvidia is creating a new, cheaper AI chip for the Chinese market and plans to start making it in large quantities in June 2025.

Nvidia’s data center segment drove the bulk of its revenue, contributing $39.1 billion, a 10% rise from the previous quarter. Despite the earnings miss, investor confidence remained strong, with Nvidia’s stock (NVDA) climbing 4.89% to $141.40 in after-hours trading on May 28, after closing down 0.51% at $134.81.
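The after-hours move quoted above is easy to verify from the two prices given:

```python
close_price = 134.81   # May 28 regular-session close, USD
after_hours = 141.40   # after-hours price, USD

# Percentage change from the close to the after-hours level
gain_pct = (after_hours - close_price) / close_price * 100
print(f"After-hours gain: {gain_pct:.2f}%")  # ~4.89%
```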

While Nvidia deals with restrictions on selling chips to China, other U.S. tech giants are focusing more on artificial intelligence (AI). Microsoft is building two new AI research centers in Abu Dhabi to advance its AI technology. 

At the same time, some companies that used to focus on Bitcoin mining are now using their equipment to help run powerful AI programs, like those that understand and generate human-like text. This highlights that many companies are shifting their efforts to build and support AI technology.




Product Reviews

Nvidia posts record $44 billion revenue, H20 export ban bites as gaming rises

by admin May 29, 2025



Nvidia on Wednesday disclosed its financial results for the first quarter of its fiscal 2026, posting revenue of $44.062 billion, its best quarter ever. The company’s sales increased almost across the board, both quarter-over-quarter and year-over-year. As the company ramped up its Blackwell GPUs, it also set revenue records for both gaming and data center. However, the recent ban on shipments of H20 GPUs to China hurt Nvidia’s margins quite significantly.

Record quarter

For the first quarter of fiscal 2026, Nvidia reported GAAP revenue of $44.062 billion, marking a 12% rise quarter-over-quarter (QoQ) and a 69% increase year-over-year (YoY). The company’s gross margin fell sharply to 60.5%, primarily due to a $4.5 billion charge related to the write-down of H20 inventory following the latest U.S. export restrictions imposed in early April.


Without the charge, Nvidia’s non-GAAP margin would have been 71.3%, still considerably lower than 78.9% in Q1 FY2025 or 73.5% in Q4 FY2025. Nvidia’s operating income was $21.6 billion, down 10% from the prior quarter but up 28% year-over-year, while net income reached $18.8 billion, a 15% sequential decline but a 26% increase from the same period a year ago.
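The effect of the H20 charge on gross margin can be sketched with back-of-the-envelope arithmetic: adding the $4.5 billion write-down back to GAAP gross profit lands close to the reported 71.3% non-GAAP figure. The small remaining gap comes from other non-GAAP adjustments (such as stock-based compensation), which this rough sketch ignores:

```python
revenue = 44.062        # Q1 FY2026 revenue, $B
gaap_margin = 0.605     # reported GAAP gross margin
h20_charge = 4.5        # H20 inventory write-down, $B

# Add the one-time charge back to GAAP gross profit
gross_profit = revenue * gaap_margin
adjusted_margin = (gross_profit + h20_charge) / revenue
print(f"Margin excluding the charge: {adjusted_margin:.1%}")  # ~70.7%
```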



Driven by AI and gaming

Nvidia’s data center revenue set a new record of $39.112 billion, comprising $34.155 billion in compute revenue and $4.957 billion in networking revenue. The result represented 10% quarter-over-quarter and 73% year-over-year growth, driven by surging global demand for AI infrastructure. Nvidia does not provide the split between sales of Blackwell and Hopper AI GPUs, or between Blackwell and Hopper systems, but it said that the transition to Blackwell is almost complete. This means that while there are still some customers interested in Hopper processors, the vast majority of its clients now want Blackwell products. In addition, the company highlighted strong momentum in Blackwell-based systems as GB200 NVL72 machines ramped to full-scale production during the quarter.

“Our breakthrough Blackwell NVL72 AI supercomputer — a ‘thinking machine’ designed for reasoning — is now in full-scale production across system makers and cloud service providers,” said Jensen Huang, founder and CEO of Nvidia. “Global demand for Nvidia’s AI infrastructure is incredibly strong. AI inference token generation has surged tenfold in just one year, and as AI agents become mainstream, the demand for AI computing will accelerate.” 


Nvidia’s gaming products also achieved record-breaking revenue of $3.8 billion in the quarter, a 48% increase from the previous quarter and a 42% rise YoY. This growth was driven by multiple factors, including insufficient gaming GPU shipments in the previous quarter as well as the launch of Nvidia’s mainstream GeForce RTX 5070 and 5060-series products based on the Blackwell architecture. The OEM and other segment generated $111 million, down 12% sequentially but up 42% year-over-year.

Nvidia’s professional visualization (ProViz) business reported revenue of $509 million, down from $511 million QoQ but up 19% from $427 million in the same quarter a year ago. Such results may indicate that workstation makers continued to purchase Ada Lovelace-based professional graphics cards despite the imminent release of Blackwell-based RTX Pro graphics boards in May, perhaps because of uncertainties around the U.S. tariffs.


It is noteworthy that sales of Nvidia’s client and professional GPUs — which are reported under gaming, ProViz, OEMs, and other monikers — totaled $4.42 billion, which is lower than sales of Nvidia’s networking gear. 

Nvidia’s automotive and robotics segment earned $567 million, down from $570 million in the previous quarter, but up a whopping 72% from $329 million in Q1 FY2025. 

(Image credit: Nvidia)

 Impressive outlook 

For the second quarter of fiscal 2026, Nvidia expects revenue of approximately $45.0 billion ± 2%. The company’s Q2 revenue outlook could have been $8.0 billion higher had there been no H20 export restrictions. Still, the company projects a GAAP gross margin of 71.8%, and Nvidia’s goal is to reach mid-70% gross margins later in the year. This recovery reflects an improving product mix and normalization after the Q1 inventory charge related to unsellable H20 units.
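Spelled out, the guidance translates into the following ranges (a simple sketch using the figures above):

```python
midpoint = 45.0   # guided Q2 FY2026 revenue, $B
band = 0.02       # ±2% guidance band

low, high = midpoint * (1 - band), midpoint * (1 + band)
print(f"Guided range: ${low:.1f}B to ${high:.1f}B")  # $44.1B to $45.9B

# Without the H20 export restrictions, guidance would have been ~$8B higher.
unconstrained = midpoint + 8.0
print(f"Unconstrained estimate: ~${unconstrained:.0f}B")  # ~$53B
```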

Operating expenses in Q2 FY2026 are projected to be around $5.7 billion on a GAAP basis. The vast majority of that sum will be used for research and development (R&D).

Helene Braun
Crypto Trends

Nvidia Reports Strong Results, but Outlook is Tempered

by admin May 29, 2025



Shares of Nvidia (NVDA) rose roughly 4% in after-hours trading after the company reported better-than-expected earnings and revenue on Wednesday.

The AI powerhouse posted a 69% increase in revenue in the first quarter, compared to a year ago, with its data center business growing 73% year-over-year. Net income came in at $18.8 billion, up 26% from a year earlier.

The after-hours move pushed NVDA shares to a modest year-to-date gain and about a 20% year-over-year advance.

AI crypto tokens, including Bittensor (TAO), NEAR Protocol (NEAR), and Internet Computer (ICP), moved slightly higher after Nvidia’s earnings beat, although they remained sizably lower for the day. Nevertheless, ongoing AI demand was a key driver of the 73% growth in the company’s data center business.

Turning to the outlook amid recent global trade uncertainties, Nvidia said it expects second-quarter revenue to come in below market estimates as a result of U.S. export restrictions on chip sales to China.


