Tag: Data


AI data centers are swallowing the world’s memory and storage supply, setting the stage for a pricing apocalypse that could last a decade

by admin October 4, 2025




Nearly every analyst firm and memory maker is now warning of looming shortages of NAND and DRAM that will send SSD and memory prices skyrocketing over the coming months and years, with some even predicting a shortage that will last a decade. The shortfalls are becoming impossible to ignore, and the warnings from the industry are growing ever more dire as the voracious appetite of AI data centers begins to consume the lion's share of the world's memory and flash production capacity.

For the better part of two years, storage upgrades have been a rare bright spot for PC builders. SSD prices cratered to all-time lows in 2023, with high-performance NVMe drives selling for little more than the cost of a modest mechanical hard disk. DRAM followed a similar trajectory, dropping to price points not seen in nearly a decade. In 2024, the pendulum swung firmly in the other direction, with prices for both NAND flash and DRAM starting to climb.

The shift has its roots in the cyclical nature of memory manufacturing, but is amplified this time by the extraordinary demands of AI and hyperscalers. The result is a broad supply squeeze that touches every corner of the industry. From consumer SSDs and DDR4 kits to enterprise storage arrays and bulk HDD shipments, there’s a singular throughline: costs are moving upward in a convergence that the market has not seen in years.



From glut to scarcity

The downturn of 2022 and early 2023 left memory makers in dire straits. Both NAND and DRAM were selling below cost, and inventories piled up. Manufacturers responded with drastic output cuts to stem the bleeding. By the second half of 2023, those reductions had worked their way through to sales channels. NAND spot prices for 512Gb TLC parts, which had fallen to record lows, rose by more than 100% in the span of six months, and contract pricing followed.

That rebound quickly showed up on retail shelves. Western Digital’s 2TB Black SN850X sold for upwards of $150 in early 2024, while Samsung’s 990 Pro 2TB went from a holiday low of around $120 to more than $175 within the same timeframe.

The DRAM market lagged NAND by a quarter, but the pattern was the same. DDR4 modules, which looked like clearance items in 2023, hit a supply crunch as production lines began to wind down. Prices for PC-grade DDR4 products were forecast to jump by 38-43% quarter-over-quarter in Q3 2025, with server DDR4 close behind at 28-33%. Even the graphics memory market began to strain: vendors shifted to GDDR7 for next-generation GPUs, and shortfalls in GDDR6 supply inflated prices by around 30%. DDR5, still ramping as the mainstream standard, rose more modestly but showed a clear upward slope.

Hard drives faced their own constraints. Western Digital notified partners in April 2024 that it would increase HDD prices by 5-10% in response to limited supply. Meanwhile, TrendForce recently identified a shortage in nearline HDDs, the high-capacity models used in data centers. That shortage redirected some workloads toward flash, tightening NAND supply further.

AI’s insatiable appetite


Every memory cycle has a trigger, or a series of triggers. In past years, it was the arrival of smartphones, then solid-state notebooks, then cloud storage. This time, the main driver of demand is AI. Training and deploying large language models require vast amounts of memory and storage, and each GPU node in a training cluster can consume hundreds of gigabytes of DRAM and multiple terabytes of flash storage. Within large-scale data centers, the numbers are staggering.
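To get a feel for why those numbers are staggering, here is a back-of-the-envelope sketch in Python. The per-node figures and the cluster size are illustrative assumptions chosen to match the "hundreds of gigabytes" and "multiple terabytes" ranges above, not reported specifications.

```python
# Back-of-the-envelope cluster memory math. All inputs are illustrative
# assumptions, chosen to match the rough ranges quoted above.
dram_per_node_gb = 512     # assumed: "hundreds of gigabytes" of DRAM per GPU node
flash_per_node_tb = 8      # assumed: "multiple terabytes" of flash per GPU node
nodes = 10_000             # assumed node count for one large training cluster

total_dram_pb = dram_per_node_gb * nodes / 1_000 / 1_000   # GB -> TB -> PB
total_flash_pb = flash_per_node_tb * nodes / 1_000         # TB -> PB

print(f"DRAM across the cluster:  ~{total_dram_pb:.1f} PB")   # ~5.1 PB
print(f"Flash across the cluster: ~{total_flash_pb:.1f} PB")  # ~80 PB
```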

OpenAI’s “Stargate” project has recently signed an agreement with Samsung and SK hynix for up to 900,000 wafers of DRAM per month. That figure alone would account for close to 40% of global DRAM output. Whether the full allocation is realized or not, the fact that such a deal even exists shows how aggressively AI firms are locking in supply at an enormous scale.
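Taking the two figures in that paragraph at face value, the implied size of total global DRAM wafer output falls out directly. A quick sketch; the 40% share is treated as exact here, which the reporting does not claim:

```python
# Implied global DRAM output from the Stargate figures quoted above.
stargate_wafers_per_month = 900_000   # reported upper bound of the agreement
share_of_global_output = 0.40         # "close to 40%" -- treated as exact for the arithmetic

implied_global_output = stargate_wafers_per_month / share_of_global_output
print(f"Implied global DRAM output: ~{implied_global_output:,.0f} wafers per month")
# -> ~2,250,000 wafers per month
```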

Cloud service providers are behaving similarly. High-density NAND products are effectively sold out months in advance. Samsung’s next-generation V9 NAND is already nearly booked before it’s even launched. Micron has presold almost all of its High Bandwidth Memory (HBM) output through 2026. Contracts that once covered a quarter now span years, with hyperscalers buying directly at the source.



The knock-on effects are visible at the consumer level. Raspberry Pi, which had stockpiled memory during the downturn, was forced to raise prices in October 2025 due to memory costs. The 4GB versions of its Compute Module 4 and 5 increased by $5, while the 8GB models rose by $10. Eben Upton, the company’s CEO, noted that “memory costs roughly 120% more than it did a year ago,” in an official statement on the Raspberry Pi website. Seemingly, nothing and no one can escape the surge in pricing.

Shifting investment priorities

A shortage is not simply a matter of demand rising too quickly. Supply is also being redirected. Over the past decade, NAND and DRAM makers learned that unchecked production expansion usually leads to collapse. After each boom, the subsequent oversupply destroyed margins, so the response this cycle has been more restrained.

Samsung, SK hynix, and Micron have all diverted capital expenditure toward HBM and advanced nodes. HBM, in particular, commands exceptional margins, making it an obvious priority. Micron’s entire 2026 HBM output is already committed, and every wafer devoted to HBM is one not available for DRAM. The same is true for NAND, where engineering effort and production are concentrated on 3D QLC NAND for enterprise customers.

According to the CEO of Phison Electronics, Taiwan's largest NAND controller company, it's this redirection of capital expenditure that will, he claims, keep supply tight for the next decade.

“NAND will face severe shortages in the next year. I think supply will be tight for the next ten years,” he said in a recent interview. When asked why, he said, “Two reasons. First… every time flash makers invested more, prices collapsed, and they never recouped their investments… Then in 2023, Micron and SK hynix redirected huge capex into HBM because the margins were so attractive, leaving even less investment for flash.”


It’s these actions that are squeezing more mainstream products even tighter. DDR4 is being wound down faster than demand is tapering. Meanwhile, TLC NAND, once abundant, is also being rationed as manufacturers allocate their resources where the money is, leaving older but still essential segments undersupplied.

The same story is playing out in storage. For the first time, NAND flash and HDDs are both constrained at once. Historically, when one was expensive, the other provided a release valve, but training large models involves ingesting petabytes of data, and all of it has to live somewhere. That “warm” data usually sits on nearline HDDs in data centers, but demand is now so high that lead times for top-capacity drives have stretched beyond a year.

With nearline HDDs scarce, some hyperscalers are accelerating the deployment of QLC flash arrays. That solves one bottleneck, but creates another, pushing demand pressure back onto NAND supply chains. For the first time, SSDs are being adopted at scale for roles where cost-per-gigabyte once excluded them. The result is a squeeze from both sides, with HDD prices rising because of supply limits and SSD prices firming as cloud buyers step in to fill the gap.

Why not build even more fabs?


Fabs are being built, but they’re expensive and take a long time to get up and running, especially in the U.S. A new greenfield memory fab comes with a price tag in the tens of billions, and requires several years before volume production. Even expansions of existing lines take months of tool installation and calibration, with equipment suppliers such as ASML and Applied Materials struggling with major backlogs.

Manufacturers also remain wary of repeating past mistakes. If demand cools or procurement pauses after stockpiling, an overbuilt market could send prices tumbling. The scars of 2019 and 2022 are still fresh in their minds. This makes companies reluctant to bet on long-term cycles, even as AI demand looks insatiable today — after all, many believe that we’re witnessing an AI bubble.

Geopolitics adds yet more complexity to the conundrum. Export controls on advanced lithography equipment and restrictions on rare earth elements complicate any potential HDD fab expansion plans. Hard drives rely on neodymium magnets, one of the most sought-after types of rare earth material, and HDDs are among the largest consumers of rare earth magnets in the world. China currently dominates production of these materials and has recently restricted the supply of magnets as a retaliatory action against the U.S. in the ongoing trade war between the two nations.

Even if the capital were available, the supply chain for the required tools and materials is itself constrained. Talent shortages in semiconductor engineering slow the process even further. The net result is deliberate discipline, with manufacturers choosing to sell existing supply at higher margins rather than risk another collapse.


Unfortunately, manufacturers' approaches to the matter are unlikely to change any time soon. For consumers, this puts an end to ultra-cheap PC upgrades, while enterprise customers will need larger infrastructure budgets. Storage arrays, servers, and GPU clusters all require more memory at a higher cost, and many hyperscalers make their own SSDs using custom controllers from several vendors. Larger companies, like Pure Storage, procure NAND in massive quantities for the all-flash arrays that power AI data centers. Some hyperscalers have already adjusted by reserving supply years in advance. Smaller operators without that leverage face longer lead times and steeper bills.

Flexibility is reduced in both cases. Consumers can delay an upgrade or accept smaller capacities, but the broader effect is to slow the adoption of high-capacity drives and larger memory footprints. Enterprises have little choice but to absorb costs, given the critical role of memory in AI and cloud workloads.

The market should eventually rebalance, but it’s impossible to predict when. New fabs are under construction, supported by government incentives, and if demand growth moderates or procurement pauses, the cycle could shift back toward oversupply.

Until then, prices for NAND flash, DRAM, and HDDs will likely remain elevated into 2026. Enterprise buyers will continue to command priority, leaving consumers to compete for what remains. And the seasonal price dips we took for granted in years gone by probably won't be returning any time soon.





Discord customer service data breach leaks user info and scanned photo IDs

by admin October 3, 2025


One of Discord’s third-party customer service providers was compromised by an “unauthorized party,” the company says. The unauthorized party gained access to “information from a limited number of users who had contacted Discord through our Customer Support and/or Trust & Safety teams” and aimed to “extort a financial ransom from Discord.” The unauthorized party “did not gain access to Discord directly.”

Data potentially accessed in the breach includes names, usernames, emails, and the last four digits of credit card numbers. The unauthorized party also accessed a "small number" of images of government IDs from "users who had appealed an age determination." Full credit card numbers and passwords were not impacted by the breach, Discord says.

The company is notifying impacted users now over email. If your ID might have been accessed, Discord will specify that. Discord also says it revoked the support provider’s access to Discord’s ticketing system, has notified data protection authorities, is working with law enforcement, and has reviewed “our threat detection systems and security controls for third-party support providers.”




US Air Force investigating data breach caused by Microsoft SharePoint issue

by admin October 2, 2025



  • US Air Force investigating SharePoint breach exposing PII and PHI across its systems
  • Chinese-linked groups exploited SharePoint flaws
  • Microsoft and US authorities are actively investigating the scope and impact of the breach

The US Air Force is reportedly investigating a potential data breach caused by a Microsoft SharePoint issue.

A report from The Register revealed that the Air Force Personnel Center Directorate of Technology and Information issued a data breach notification, which was shared on social media.

“This message is to inform you of a critical Personally Identifiable Information (PII) and Protected Health Information (PHI) exposure related to USAF SharePoint Permissions,” the warning reads. “As a result of this breach, all USAF SharePoints will be blocked Air Force-wide to protect sensitive information.”



Big names

The Register reported Microsoft Teams and Power BI dashboards should also be blocked since they access SharePoint, but this information is unconfirmed at this time.

“The Department of the Air Force is aware of a privacy-related issue,” an Air Force spokesperson told The Register.

Further details are scarce right now, with little information on who the threat actors are and what they sought to achieve.

Obviously, most fingers are now being pointed towards China, following reports in early July 2025 that Microsoft had confirmed three Chinese-affiliated hacking groups exploited vulnerabilities in on-prem SharePoint servers.


The groups, called Linen Typhoon, Violet Typhoon, and Storm-2603, targeted flaws that allowed authentication bypass and remote code execution, which enabled them to steal sensitive data such as MachineKey information.

These exploits affected at least two US federal agencies and numerous other organizations globally. The situation is being actively investigated by both Microsoft and US authorities.

However, we should also not forget Russian state-sponsored groups, who have the skills and the infrastructure to pull off this kind of attack, and have done so in the not-too-distant past as well.

Previously, Microsoft faced US government fire over its lax cybersecurity approach, which even forced it to change how it operated – let’s see if this time it is any different.





Whales Keep Stacking Aster: Data Reveals 8% Controlled By Two Wallets

by admin October 1, 2025



Aster is cooling off after a week of explosive gains, losing more than 35% of its value since hitting an all-time high just days ago. The sharp correction has triggered caution among traders, but it also reflects natural profit-taking after such a rapid surge. Despite the retracement, sentiment in the market remains constructive, with many investors still anticipating further upside in the coming weeks.

One of the main drivers behind this optimism is whale activity. Onchain data shows that large holders continue to accumulate Aster during the dip, a signal that often strengthens confidence in the asset’s long-term outlook. Their consistent buying suggests conviction in the project’s fundamentals, even as price action cools in the short term.

Meanwhile, excitement around Aster continues to build. The platform has generated strong traction, and community interest has yet to fade despite the recent pullback. This combination of whale accumulation and growing DEX momentum highlights why many see the correction as an opportunity rather than the end of the rally.

Whale Accumulation Strengthens Aster’s Position

Fresh on-chain data highlights that whales continue to build significant exposure to Aster. According to Lookonchain, wallet 0xFB3B withdrew another 3.19 million ASTER — worth approximately $5.27 million — from Gate.io just six hours ago.

Combined with another large holder, the two wallets now control 132.78 million ASTER, valued at $218 million. This concentration represents 8.01% of the circulating supply, underscoring the confidence whales have in its long-term trajectory.
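The quoted token count, dollar value, and supply share imply a per-token price and a total circulating supply; a quick consistency check on the figures above, rounded as reported:

```python
# Consistency check on the whale-holding figures quoted above.
whale_tokens = 132_780_000      # 132.78 million ASTER held across the two wallets
whale_value_usd = 218_000_000   # reported combined value
supply_share = 0.0801           # 8.01% of circulating supply

implied_price = whale_value_usd / whale_tokens
implied_circulating_supply = whale_tokens / supply_share

print(f"Implied ASTER price:        ${implied_price:.2f}")                            # ~$1.64
print(f"Implied circulating supply: ~{implied_circulating_supply / 1e9:.2f}B ASTER")  # ~1.66 billion
```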

Such activity comes at a time when the broader market is buzzing with what many call “DEX season.” Decentralized exchanges have drawn increasing attention as traders seek alternatives to centralized platforms and look for more transparency, control, and composability. Perpetual DEXs in particular have surged in popularity, with projects like Hyperliquid and Avantis capturing strong user interest.

Aster, however, is positioning itself firmly in this competitive landscape. Despite recent volatility and a 35% pullback from its all-time high, the project continues to attract capital and community engagement.

Whale accumulation suggests that sophisticated investors see Aster as one of the contenders capable of holding its ground alongside leading perpetual platforms. Its growing liquidity base and active ecosystem make it well placed to capture a share of the demand fueling the current decentralized trading boom.

In short, while short-term price action remains choppy, whale activity and the ongoing DEX narrative provide strong tailwinds. If Aster sustains momentum and continues to scale, it could solidify itself as a serious competitor in the battle for dominance among next-generation perpetual DEXs.

Aster Rebounds After Sharp Correction

Aster is trading around $1.72 after a steep decline from last week’s all-time high above $2.60. The 2-hour chart highlights the intensity of the recent correction, with price falling more than 35% in just a few days before finding support near the $1.55 zone. This level acted as a short-term floor, triggering a rebound as buyers stepped back in.

Price Testing Critical Resistance | Source: ASTERUSDT chart on TradingView

Currently, ASTER is attempting to reclaim ground above its short-term moving average (blue), but momentum remains fragile. Volume spikes during the sell-off show that profit-taking dominated market activity, while the rebound so far has come with lighter volume, suggesting that conviction among buyers has not yet fully returned. The $1.80 level now stands as the first key resistance. If bulls can push through it, the next challenge lies around $2.00, where the 100-period moving average (green) is converging.

On the downside, failure to hold $1.60 could invite another wave of selling, potentially dragging ASTER toward $1.40. Despite this short-term weakness, the broader trend remains fueled by whale accumulation and rising interest in Aster’s DEX ecosystem. If momentum stabilizes, the rebound could evolve into a stronger recovery in the coming sessions.
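The moving averages referenced in the chart commentary can be reproduced from any OHLC feed. A minimal pandas sketch, assuming a DataFrame of 2-hour candles with a "close" column; the data source, the column name, and the 20-period window for the short-term average are assumptions, since the chart only specifies the 100-period line:

```python
import pandas as pd

def add_moving_averages(candles: pd.DataFrame) -> pd.DataFrame:
    """Attach the moving averages discussed above to a 2-hour candle DataFrame.

    Assumes a 'close' column; the short-term window of 20 periods is a guess,
    while the 100-period window matches the green line in the commentary.
    """
    out = candles.copy()
    out["ma_short"] = out["close"].rolling(window=20).mean()
    out["ma_100"] = out["close"].rolling(window=100).mean()
    return out

# Usage sketch: compare the latest close against the $1.80 resistance noted above.
# candles = add_moving_averages(candles)
# print(candles[["close", "ma_short", "ma_100"]].tail(1))
```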






Garbage in, Agentic out: why data and document quality is critical to autonomous AI’s success

by admin October 1, 2025



There is a lot of optimism about the future of agentic AI because it promises to drive higher levels of digital transformation by autonomously handling complex, multi-step tasks with accuracy, speed, and scalability.

Much of the buzz around AI agents is due to their ability to make decisions without human intervention, freeing up skilled talent for strategic work, and scaling decision-making without adding headcount.

That said, how can companies go beyond the hype to gain a better understanding of how agentic AI can drive higher efficiency and return on investment?



According to PwC, there is growing interest translating into IT investments in agentic AI. In its May 2025 survey, 88% of respondents said their team or line of business plans to increase AI-related budgets in the next year because of agentic AI.

Scott Francis, Technology Evangelist at PFU America, Inc.

And 79% reported AI agents are already being adopted in their companies, and of those that have adopted agents, two-thirds (66%) claim they’re delivering measurable value through increased productivity.

But there are some clouds on the horizon: Gartner predicts that more than 40% of agentic AI projects will be canceled by the end of 2027 due to escalating costs, unclear business value, or inadequate risk controls.

However, done right, and with proper preparation, agentic AI has the potential to be far more disruptive than generative AI because of its direct impact on business KPIs such as cost reduction, faster decision-making, and task completion.


Agents driving healthcare transformation

Early-adopter use cases are already showing promise. Take Nvidia for example. The AI innovator is developing an enterprise AI platform to create task-specific AI agents, including one for The Ottawa Hospital that will handle patients’ pre-operative questions 24/7.

This includes providing details on how to get ready for surgery, and on post-surgery recovery and rehabilitation. According to Kimberly Powell, vice president and general manager of healthcare at Nvidia, AI agents can save providers time and money, while also enhancing the patient experience.

However, investment in agentic AI is a waste of time, money, and resources if the underlying models are fed outdated, poor-quality, or inaccurate data. The Ottawa Hospital agent in development, for example, relies on well-organized, accurate, up-to-date patient information to drive decision-making and automate tasks.



Healthcare is just one potential use case for agentic AI. Businesses in almost any industry stand to benefit from improved efficiency through task automation, reduction in human error, and scaled decision making in applications ranging from customer support, procurement, IT operations, and more.

Data and document quality determine agentic AI effectiveness

Unlike GenAI — a very useful content creation tool — agentic AI acts autonomously, which is why data and document quality is even more imperative. The LLMs at the core of agents require clean, validated, and secure data, because agents' actions and decisions are only as good as the data and rules they're given.

Agentic AI relies on structured data and digitized documents to make decisions, trigger workflows, or generate outputs. Bottom line: inaccurate, outdated, or incomplete data directly skews the logic the AI uses to act.

One scenario illustrating how agents can go terribly wrong is in bank loan applications. If the financial data from scanned forms or other inputs is outdated, it could lead the AI to approve a high-risk applicant, increasing the potential for bank losses.

For non-digital documentation, hard copies that have been scanned using old equipment with low resolution and poor image quality can confuse optical character recognition (OCR) and natural language processing (NLP) systems, leading the agent to misinterpret content.

Advanced, high-speed imaging scanners that rotate skewed documents, offer 300 DPI resolution, and use adaptive thresholding to enhance characters and remove stains, watermarks, and background noise are ideal for accurate OCR – and more reliable downstream results.
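As a concrete illustration of that preprocessing step, here is a minimal sketch using OpenCV that applies median denoising and adaptive thresholding to a scanned page before OCR. The library choice and parameter values are assumptions for illustration, not any scanner vendor's pipeline; hardware deskewing and 300 DPI capture are assumed to have happened upstream.

```python
import cv2

def clean_scan_for_ocr(path: str):
    """Binarize a scanned page ahead of OCR.

    Deskewing and resolution are assumed to be handled by the scanner itself;
    this sketch only reproduces the adaptive-thresholding step in software.
    """
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)

    # Light median filtering knocks out speckle and scanner noise before binarization.
    gray = cv2.medianBlur(gray, 3)

    # Adaptive thresholding binarizes each neighbourhood separately, which suppresses
    # stains, watermarks, and uneven lighting that would otherwise confuse OCR.
    return cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY, 31, 15
    )
```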

Data preparation makes all the difference

To stop autonomous agents from "hallucinating" or delivering poor decisions that could impact operational efficiency, organizations should follow industry-leading data management and retention best practices to prepare data sets before introducing them to an LLM, including the following (a minimal cleanup sketch follows the list):

  • Preprocess and clean data – Without consistently doing data “spring cleaning” even the most advanced AI will struggle and be less effective. It’s critical to remove duplicate documents and data, outdated versions, and corrupt files. Using AI for document classification, summarization, and cleanup dramatically speeds up the process while reducing manual effort. Even fixing typos, formatting issues, and inconsistent structures in scanned documents and PDFs will improve AI inputs, minimizing the potential for “garbage in, garbage out.”
  • Classify and tag documents – Once the data has been cleansed and processed, apply metadata labels — such as “sales presentation” or “HR training manual” — to documents for easier identification and then organize content into semantic categories relevant to business processes. Giving documents structure enables agents to gain a better understanding of context and relevance.
  • Preserve data confidentiality – It’s critical that all AI systems only have access to the data and documents they need, and nothing more. This also applies to the use of external APIs or tools. Sensitive, personal data that’s no longer needed should be sanitized and erased permanently to minimize risks related to data privacy, leakage, or compliance violations.
  • Test and analyze – Finally, run tests on sample prompts using smaller document sets and then analyze the outputs. Use feedback loops to refine data sources and formatting before scaling up. This important step will enable IT teams to catch formatting issues, hallucinations, or data misinterpretations early.
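As a minimal sketch of the first bullet, the snippet below deduplicates a folder of plain-text exports by content hash and strips control characters and runs of whitespace. Classification, tagging, redaction, and non-text formats are deliberately out of scope, and the directory layout is an assumption.

```python
import hashlib
import re
from pathlib import Path

def dedupe_and_clean(corpus_dir: str, out_dir: str) -> int:
    """Drop exact-duplicate .txt files and normalize whitespace before LLM ingestion."""
    seen: set[str] = set()
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    kept = 0

    for path in sorted(Path(corpus_dir).glob("*.txt")):
        text = path.read_text(encoding="utf-8", errors="ignore")

        # Strip control characters and collapse runs of spaces/tabs.
        text = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f]", "", text)
        text = re.sub(r"[ \t]+", " ", text).strip()

        # A content hash catches exact duplicates regardless of filename.
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if not text or digest in seen:
            continue
        seen.add(digest)

        (out / path.name).write_text(text, encoding="utf-8")
        kept += 1

    return kept
```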

The quality imperative and the future of agentic AI

There's a lot riding on the promises of autonomous AI, with spending projected to reach $155 billion by 2030. However, for agentic AI to be accurate, reliable, and support compliance, organizations must prioritize data and document quality.

By adopting best practices that prioritize clean, well-governed data, and clear documentation, organizations can ensure the AI agents they’re employing operate with precision and integrity. In a future shaped by autonomous systems, high-quality information isn’t just an asset, it’s a prerequisite for trusted and effective agentic output.






New Life-Giving Molecules Found in 17-Year-Old Data From Saturn’s Moon Enceladus

by admin October 1, 2025



The south pole of Enceladus—a tiny moon orbiting Saturn—is a volatile place. In this region, the moon’s subsurface ocean spews jets of water through four “tiger stripe” cracks in the icy crust, culminating in a single plume of ice particles that stretches hundreds of miles into space.

The Cassini spacecraft spent two decades studying these particles to search for evidence of habitability on Enceladus. In 2008, the probe flew straight through the icy plume to study particles that were ejected only minutes before they hit the spacecraft’s Cosmic Dust Analyzer (CDA). More than 15 years later, scientists have finally deciphered this data, finding that the particles contained organic molecules never seen in Enceladus’s ejections before.

The study, published Wednesday in the journal Nature Astronomy, reports that the newly detected molecules include some involved in chains of chemical reactions that ultimately give rise to more complex compounds essential for life on Earth, according to the researchers.

“There are many possible pathways from the organic molecules we found in the Cassini data to potentially biologically relevant compounds, which enhances the likelihood that the moon is habitable,” said lead author Nozair Khawaja, a researcher at Freie Universität Berlin, in a statement from the European Space Agency (ESA).

The search for signs of habitability on Enceladus

Enceladus’s subsurface ocean has captivated astrobiologists ever since the Cassini mission, a joint endeavor between NASA, ESA, and the Italian Space Agency (ASI), first discovered evidence of it in 2014. Life as we know it can’t exist without water, so a moon with a vast reserve of the stuff is a pretty good place to look for life-giving molecules.

Cassini orbited Saturn from 2004 to 2017 before dramatically plunging into the ringed planet. During this time, the probe detected many molecules of interest—including phosphorus compounds and precursors for amino acids—as it flew through Saturn's E ring, which is largely made of water-ice ejected from Enceladus.

However, grains of ice in the E ring can be hundreds of years old. As they age, they may lose some traces of organic molecules present in Enceladus’s subsurface ocean. To get a better understanding of what’s really going on down there, Khawaja and his colleagues set out to analyze data taken from a fresher source.

Getting closer to the source

The researchers specifically looked at data Cassini gathered during its foray into Enceladus’s icy plume. The freshly ejected particles slammed into the spacecraft’s CDA instrument at high speed—roughly 11 miles (18 kilometers) per second.

The speed of impact proved to be just as important as the particles' freshness. "At lower impact speeds, the ice shatters, and the signal from clusters of water molecules can hide the signal from certain organic molecules," Khawaja explained. "But when the ice grains hit CDA fast, water molecules don't cluster, and we have a chance to see these previously hidden signals."

This would explain why he and his colleagues uncovered new organic molecules in this data. They also detected some that had already been found in the E ring, confirming that they come from Enceladus’s ocean. This contradicts recent evidence that suggests these molecules may actually stem from radiation-driven chemistry on the moon’s surface and inside its plumes.

While the findings strengthen the case for Enceladus’s habitability, there is still much work to be done to confirm whether this icy wasteland can support life. ESA aims to launch another mission to explore this distant moon, this time searching for signs of habitability on the surface.

“Even not finding life on Enceladus would be a huge discovery, because it raises serious questions about why life is not present in such an environment when the right conditions are there,” Khawaja said.




The EPA Is Ending Greenhouse Gas Data Collection. Who Will Step Up to Fill the Gap?

by admin October 1, 2025


The Environmental Protection Agency announced earlier this month that it would stop making polluting companies report their greenhouse gas emissions to it, eliminating a crucial tool the US uses to track emissions and form climate policy. Climate NGOs say their work could help plug some of the data gap, but they and other experts fear the EPA’s work can’t be fully matched.

“I don’t think this system can be fully replaced,” says Joseph Goffman, the former assistant administrator at the EPA’s Office of Air and Radiation. “I think it could be approximated—but it’s going to take time.”

The Clean Air Act requires states to collect data on local pollution levels, which states then turn over to the federal government. For the past 15 years, the EPA has also collected data on carbon dioxide, methane, and other greenhouse gases from sources around the country that emit over a certain threshold of emissions. This program is known as the Greenhouse Gas Reporting Program (GHGRP) and “is really the backbone of the air quality reporting system in the United States,” says Kevin Gurney, a professor of atmospheric science at Northern Arizona University.

Like myriad other data-collection processes that have been stalled or halted since the start of this year, this program is now in the Trump administration's crosshairs. In March, the EPA announced it would be reconsidering the GHGRP entirely. In September, the agency trotted out a proposed rule to eliminate reporting obligations for sources ranging from power plants to oil and gas refineries to chemical facilities—all major sources of greenhouse gas emissions. (The agency claims that rolling back the GHGRP will save $2.4 billion in regulatory costs, and that the program is “nothing more than bureaucratic red tape that does nothing to improve air quality.”)

Goffman says shutting down this program hamstrings “the government’s basic practical capacity to formulate climate policy.” Understanding how new emissions-reduction technologies are working, or surveying which industries are decarbonizing and which are not, “is extremely hard to do if you don’t have this data.”

Data collected by the GHGRP, which is publicly available, underpins much of federal climate policy: understanding which sectors are contributing which kinds of emissions is the first step in forming strategies to draw those emissions down. This data is also the backbone of much of international US climate policy: collection of greenhouse gas emissions data is mandated by the UN Framework Convention on Climate Change, which undergirds the Paris Agreement. (While the US exited the Paris Agreement for the second time on the first day of Trump’s second term, it remains—tenuously—a part of the UNFCCC.) Data collected by the GHGRP is also crucial to state and local climate policies, helping policymakers outside the federal government take stock of local pollution, form emissions-reductions goals, and track progress on bringing down emissions.




Crypto Miner TeraWulf to Raise $3B in Google-Backed Debt Deal to Expand Data Centers

by admin September 28, 2025



Crypto mining firm TeraWulf (WULF) is planning to raise $3 billion in debt to expand its data center operations in a deal supported by Google, as the AI infrastructure arms race intensifies.

The company, Bloomberg reports citing TeraWulf CEO Patrick Fleury, is working with Morgan Stanley to arrange the funding, which could launch as early as next month through high-yield bonds or leveraged loans.

Credit rating agencies are evaluating the deal, and Google’s support may help it secure a stronger credit rating than would be typical for the firm.

The AI industry's hunger for data center space, chips, and electricity has turned crypto miners into unlikely partners, since they already control power-intensive infrastructure that can be repurposed for AI workloads.

Google, which recently increased its backstop for TeraWulf to $3.2 billion, now holds a 14% stake in the company. That support helped AI cloud platform Fluidstack expand its use of a TeraWulf-run data center in New York in August.

Other crypto-native firms are following suit. Cipher Mining struck a similar agreement with Google and Fluidstack this week. Google will also backstop $1.4 billion in obligations tied to that deal and take an equity stake in Cipher.

TeraWulf shares dropped around 1.3% in Friday’s trading session and were unchanged in after-hours trading.




Fraudulent GitHub Pages impersonate trusted companies to trick Mac users into installing malware, leaving financial and personal data at risk

by admin September 24, 2025



  • Atomic Stealer malware installs silently via fake GitHub Pages targeting Mac users
  • Attackers create multiple GitHub accounts to bypass platform takedowns repeatedly
  • Users copying commands from unverified websites risk serious system compromise

Cybersecurity researchers are warning Apple Mac users about a campaign using fraudulent GitHub repositories to spread malware and infostealers.

Research from LastPass Threat Intelligence, Mitigation, and Escalation (TIME) analysts found attackers are impersonating well-known companies to convince people to download fake Mac software.

Two fraudulent GitHub pages pretending to offer LastPass for Mac were first spotted on September 16, 2025, under the username “modhopmduck476.”



How the attack chain works

While these particular pages have been taken down, the incident suggests a broader pattern that continues to evolve.

The fake GitHub pages included links labeled “Install LastPass on MacBook,” which redirected to hxxps://ahoastock825[.]github[.]io/.github/lastpass.

From there, users were sent to macprograms-pro[.]com/mac-git-2-download.html and told to paste a command into their Mac’s terminal.

That command used a curl request to fetch a base64-encoded URL that decoded to bonoud[.]com/get3/install.sh.


The script then delivered an “Update” payload that installed Atomic Stealer (AMOS malware) into the Temp directory.
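The base64 step is what hides the real download location from a quick glance at the pasted command. The harmless sketch below reproduces the obfuscation using the defanged URL reported above; the live campaign obviously encoded the working address, and the encoded string here is generated for illustration rather than taken from the attack:

```python
import base64

# Defanged payload location reported in the campaign analysis above.
defanged_url = "bonoud[.]com/get3/install.sh"

# What the attacker embeds in the terminal one-liner: an opaque-looking blob...
encoded = base64.b64encode(defanged_url.encode()).decode()
print(encoded)

# ...which the victim's shell quietly decodes back into a fetchable address.
print(base64.b64decode(encoded).decode())  # -> bonoud[.]com/get3/install.sh
```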

Atomic Stealer, which has been active since April 2023, is a known infostealer used by financially motivated cybercrime groups.

Investigators have linked this campaign to many other fake repositories impersonating companies ranging from financial institutions to productivity apps.



The list of targeted names includes 1Password, Robinhood, Citibank, Docker, Shopify, Basecamp, and numerous others.

Attackers appear to create multiple GitHub usernames to bypass takedowns, using search engine optimization (SEO) to push their malicious links higher in search results on Google and Bing.

This technique increases the chances that Mac users searching for legitimate downloads will encounter the fraudulent pages first.

LastPass states it is “actively monitoring this campaign” while working on takedowns and sharing indicators of compromise to help others detect threats.

The attackers’ use of GitHub Pages reveals both the convenience and the risks of community platforms.

Fraudulent repositories can be set up quickly, and while GitHub can remove them, attackers often return under new aliases.

This cycle raises questions about how effectively such platforms can protect users.

How to stay safe

  • Only download software from verified sources to avoid malware and ransomware risks.
  • Avoid copying commands from unfamiliar websites to prevent unauthorized code execution.
  • Keep macOS and all installed software up to date to reduce vulnerabilities.
  • Use the best antivirus or security software that includes ransomware protection to block threats.
  • Enable regular system backups to recover files if ransomware or malware strikes.
  • Stay skeptical of unexpected links, emails, and pop-ups to minimize exposure.
  • Monitor official advisories from trusted vendors for timely security updates and guidance.
  • Configure strong, unique passwords and enable two-factor authentication for important accounts.





Microsoft is resorting to laser etching AI-designed cooling channels directly into data center chips to tame their massive heat

by admin September 24, 2025



Video: "Introducing microfluidic cooling: a breakthrough in chip cooling technology" (YouTube)

If you think the power consumption of today's gaming graphics cards is bad, it's nothing compared to how much energy the massive processors in AI and data systems use. All that power ends up as heat, making chip cooling a serious challenge. Microsoft reckons it has a great solution, though, and it's all about getting water into the processors themselves.

The most complex direct-die, liquid cooling loops you’ll see in a gaming PC all involve using a chamber that mounts on top of the CPU. At no point does the coolant ever touch the chip directly. In a recently published blog, Microsoft explains how it has developed a system that does precisely that.

By etching the surface of the processor die with an intricate pattern of tiny channels, water can then be pumped directly into the silicon itself, albeit to a very shallow depth.



The keyword to describe this is microfluidics, a technology that's been around for many decades, and if the history of consumer tech is anything to go by, it'll be a phrase plastered across every CPU cooler within a couple of years (though it won't actually do anything there).

This might all just seem like Microsoft is cutting a few grooves into the chip and letting water flow through it, but it's far more complicated than that. For a start, the channels themselves are no wider than a human hair, and they're not just simple lines either. Microsoft employed the services of Swiss firm Corintis, which used AI to determine the best pattern for maximum heat transfer.


The end result is a network of microchannels that genuinely look organic, though at first glance, you’d be forgiven for thinking the complex patterns were just manufacturing defects. It certainly looks super cool (pun very much intended).

Microsoft claims the tech is up to three times more effective at removing heat from a massive AI GPU than a traditional cold plate (aka waterblock), citing a 65% reduction in the maximum temperature rise of the silicon.
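To put that 65% figure in perspective, a tiny sketch; the cold-plate baseline is an assumed number for illustration, since Microsoft quotes only the relative reduction:

```python
# Illustrative only: the baseline rise is assumed; the 65% reduction is Microsoft's claim.
coldplate_rise_c = 60.0                               # assumed silicon temperature rise with a cold plate
microfluidic_rise_c = coldplate_rise_c * (1 - 0.65)   # 65% lower maximum rise

print(f"Cold plate:    +{coldplate_rise_c:.0f} C over coolant")
print(f"Microfluidics: +{microfluidic_rise_c:.0f} C over coolant")  # +21 C
```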


Since all the coolant transfer apparatus doesn’t need to be right on top of the microchannels, the system can also be applied to stacked chips, with each one etched before mounting. This way, each die within the stack is cooled individually, meaning they can operate closer to their maximum specifications than with a normal cold plate.

Take AMD's X3D processors, for example. These all have one stacked chip underneath the heatsink: a Core Complex Die (CCD) bonded to a 3D V-Cache die. Each one acts as a thermal barrier to the other, though the CCD generates much more heat than the cache die. If both could be cooled via microfluidics, you'd be able to operate them at higher clock speeds.

Of course, such complex tech isn’t cheap to develop or implement, and the likelihood of it ever appearing at the consumer level is very slim. But I wouldn’t be surprised if somebody takes an RTX 5090, rips off the heatsink, and swaps it for a homebrewed microfluidic cooler.

There again, if ramping up power consumption is the only way AMD, Intel, and Nvidia can keep improving chip performance, perhaps we might see etched processors and direct-die cooling being standard fare in our gaming PCs. After all, it wasn’t that long ago when heatpipes and vapour chambers were phrases never to be uttered by a PC component manufacturer, but now they’re in coolers of every kind.



