Tag: safety

NFT Gaming

OCC Cites ‘Safety and Soundness’ for Crypto Bank Anchorage in Pulling Consent Order

by admin August 24, 2025



In brief

  • The OCC terminated its consent order on digital assets bank Anchorage Digital.
  • The regulator brought the order in 2022 after granting conditional approval to Anchorage in 2021.
  • Federally chartered Anchorage custodies some of the BTC and ETH held in BlackRock’s spot ETFs.

The Office of the Comptroller of the Currency (OCC) announced Thursday that it has terminated its cease-and-desist consent order against Anchorage Digital.

The regulator first issued a consent order to Anchorage, a federally chartered digital asset bank, in 2022 due to its “failure to adopt and implement a compliance program” that satisfactorily covered the Bank Secrecy Act and anti-money laundering (AML) requirements. 

“The OCC believes that the safety and soundness of the bank and its compliance with laws and regulations does not require the continued existence of the order,” the termination order reads. 

In 2021, Anchorage Digital made history when the @USOCC granted us a national bank charter to serve as a full-scale digital asset bank, providing custody, trading, settlement, governance, and other regulated services for institutions. pic.twitter.com/sMKwq3tTfv

— Anchorage Digital ⚓ Prime is Live (@Anchorage) August 21, 2025

Anchorage Digital received conditional approval from the OCC in 2021, allowing it to offer crypto custody services to its customers and making it the first federally chartered bank to custody digital assets. Now that the bank has demonstrated the appropriate compliance, the consent order has been terminated.

“When we applied for that charter, we knew what we were signing up for: the path forward was uncharted for any crypto company, and at the time, many in our industry—and most of Washington—felt that digital assets and regulation were like oil and water,” said Anchorage co-founder and CEO Nathan McCauley in a statement Thursday. 



“We embarked on that path not because it was easy, but because we knew it was the right long-term move for the industry—laying the foundation for trust, safety, and durability in the years ahead,” he added. “And in an industry intent on ‘going to the moon,’ the seeming impossibility of our federal charter mission lit a fire under us from the start.”

The South Dakota-based firm specializes in custody, staking, trading, and governance for its members. In April, BlackRock chose Anchorage to custody some of the Bitcoin and Ethereum held for the asset manager’s industry-leading spot ETFs. 

In May, the OCC affirmed that national banks it oversees can buy, sell, and manage any crypto assets in their custody. Since that time, stablecoin issuer Circle as well as Ripple and Paxos have applied for charters that would make them nationally regulated banks. 

Gaming Gear

4Chan, Gab and Kiwi Farms want Trump’s help to dodge the Online Safety Act

by admin August 24, 2025


After the United Kingdom began enforcing its sweeping Online Safety Act in April, British regulator Ofcom served violation notices to three notorious sites: 4chan, Gab, and Kiwi Farms, each of which risked multimillion-dollar fines. Late last week, Preston Byrne, a First Amendment lawyer representing them, struck back. Byrne announced he would sue Ofcom in US federal court and added an unusual request. He called on the Trump administration “to invoke all diplomatic and legal levers available to the United States” to protect his clients from the OSA’s reach.

Byrne’s request could put a trio of sites known as hotbeds of violence, harassment, and extremism at the vanguard of the Trump administration’s sweeping new diplomatic mandate: stop foreign countries from using their laws to stifle American speech — especially hate speech — on the internet.

In an interview with The Verge, Byrne said that he’d already been in communications with Congressional offices and administration officials who were following not just this case, but other enforcement incidents he’d flagged in Europe. While the Biden administration didn’t visibly intervene in European investigations into American websites, Byrne claimed that current members of the “U.S. Federal Government” were “very hungry for information, for solid, actionable information, about this… as a free speech activist, I’ve been impressed, I’ve been humbled, I’m immensely grateful to our government, and how they’re responding. I have nothing bad to say about how the government has handled this.”

International internet regulation has expanded as the US political right has gained force online, fueling a backlash against, in particular, the European Union’s Digital Services Act and the UK’s OSA. In February, Vice President J.D. Vance told a shocked crowd at the Munich Security Conference that “in Britain, and across Europe, free speech, I fear, is in retreat,” implicitly threatening to withdraw defense funding — an existential need for the E.U. as Russia’s invasion of Ukraine continued — if they did not relent. Secretary of State Marco Rubio began restricting visas for foreign officials who enforce content moderation laws against American companies, and the State Department recently instructed its embassies to push back against their European counterparts, sending along talking points in a cable sent in August.

And the OSA has faced a rocky rollout in the UK. The law can penalize platforms for not verifying users’ ages before they access pornographic or otherwise “harmful” content, or for failing to remove illegal material. When it took effect in late July, several major U.S. companies — including Reddit, Bluesky, X, and Grindr — were forced to implement age verification systems that haphazardly blocked some or all access for users who didn’t want to hand over an ID or face scan. Wikipedia has expressed concerns it would have to expose anonymous editors and moderators to comply with the OSA, and is currently suing in UK court.

Byrne’s legal goal, if Trump doesn’t intervene, is more aggressive than Wikipedia’s: he wants a US federal court to declare that the OSA is not enforceable on American companies. “Reportedly, they [the U.S. government] have pushed back on the UK on this one issue, but ultimately, it doesn’t matter. Because one lawyer, a solo practitioner working in his free time, armed with the First Amendment, can bring the OSA to a grinding halt at the shoreline of the United States.”

But he and associates are also pushing hard for a backchannel deal, and Byrne told The Verge that he had begun reaching out to members of the administration on behalf of his clients after Trump was elected. “The relevant client and I looked at each other and I said, listen, I think we’ll have a lot easier time contacting some people in the DOJ and saying, ‘Hey, did you know that this is happening and it’s infringing on Americans’ free speech rights?’”

The Verge confirmed that Byrne had made contact with Congressional offices; the State Department did not return a request for comment regarding whether it was in contact with Byrne. Although Byrne said he was not in active conversation with the White House or Congress regarding this case (“I wouldn’t call them ‘partners,’ the communication between our legal team and [the government] has been mostly one way”), his clients had been seeing quiet results. Previously, the Biden Administration had been serving notices from Germany to one of Byrne’s clients for violating the online safety law NetzDG, but Byrne argued that it had done so in a way that circumvented the Mutual Legal Assistance Treaty. “When we made contact with the [Trump] government over Ofcom, we disclosed the misuse of the MLAT procedure to serve foreign censorship demands under the Biden Administration,” he continued. “The notices [from Germany] have since stopped.”

The Trump administration’s definition of a “diplomatic solution” might be more aggressive than a lawsuit. In July it raised tariffs on Brazil by 40 percent after Brazilian Supreme Court Justice Alexandre de Moraes charged U.S.-based companies and U.S. citizens with legal violations for their social media content; earlier that month, Rumble and Trump Media, the Trump-founded company that owns Truth Social, filed a joint lawsuit alleging that Moraes was targeting their users’ American rights to privacy. (Moraes’s visa was also revoked by the State Department, as well as those of several other Brazilian judges.)

But Rumble and Truth Social — as well as more mainstream platforms like Reddit, Wikipedia and Bluesky — have less baggage than Byrne’s latest clients. Gab, Kiwi Farms, and 4chan have reputations as cultivated sources of sexist, racist, and white nationalist content, linked to acts of fatal violence and harassment. Gab, a proudly and openly white nationalist social media site which has long refused to remove antisemitic content from its platform, went temporarily offline in 2018 after a mass shooter used it to announce his attack on the Tree of Life synagogue in Pittsburgh, Pennsylvania. The Kiwi Farms community organizes harassment campaigns — with particular vitriol against transgender people — that have been tied to multiple suicides. 4chan, the primordial soup of unsavory internet culture, has helped spawn, among other things, mass shootings, QAnon, and Gamergate.

These sites allow their users to post anonymously, and they’re unsurprising targets for Ofcom, whose initial complaint against 4Chan said that the site had failed to offer a risk assessment about its userbase and was not complying with Ofcom “safety duties.” The complaint said 4chan could be subject to the law’s general fine of either £18 million or 10 percent of qualifying worldwide revenue, whichever is greater. Ofcom declined to comment, citing the complaint’s status as an ongoing investigation. (A fourth site, which offers information about methods of suicide, was also targeted; Byrne says he’s been in contact but does not currently represent it.)
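For reference, the penalty structure works as a simple maximum of two terms: an £18 million floor or 10 percent of qualifying worldwide revenue, whichever is greater. Here is a minimal sketch of that calculation; the revenue figures below are hypothetical, for illustration only:

```python
def osa_max_penalty(qualifying_worldwide_revenue: float) -> float:
    """Greater of the £18M floor or 10% of qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

# Hypothetical revenue figures, purely for illustration.
print(osa_max_penalty(5_000_000))    # small site: the £18M floor applies
print(osa_max_penalty(400_000_000))  # larger firm: 10% of revenue, i.e. £40M
```

The floor only stops being the binding figure once qualifying revenue passes £180 million, which is why the headline number looms far larger for small operators than for major platforms.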

Byrne is no stranger to representing lightning-rod, right-wing tech companies in court. Parler, a platform founded as a conservative-friendly alternative to Facebook, was among his former clients. “I’ve been saying no to foreign governments for eight years, because I was willing to represent free speech websites,” he told The Verge, and from his perspective, these were simply three more sites whose First Amendment rights were being targeted by Europeans. “The First Amendment allows Americans to talk to foreigners, to grant anonymity to foreigners, and not censor foreigners,” he said. “The First Amendment does not disappear because there is a contrary foreign rule on foreign shores.”

The US government directly defending them, instead of sticking with a safer embattled platform as a poster child, would be a show of force — and if successful, a demonstration that the OSA is toothless against any service with Trump’s backing, no matter how extreme its content. The administration’s protection of American speech abroad would stand in stark contrast with its approach inside the country, where the same State Department that’s pushing back against Europe’s digital laws is also using social media posts to deny student visa applications and revoke existing visas, targeting applicants for posting pro-Palestine content online.

Murky battles over digital sovereignty date back to the dawn of the internet, said Milton Mueller, the head of the Internet Governance Project and a professor at Georgia Tech. In 2000, he notes, the French government sued Yahoo for hosting an auction site that sold Nazi artifacts and was globally accessible — including to users in France, where buying and selling Nazi memorabilia is criminalized. Yahoo, which is based in the U.S., argued that it and its users were protected under America’s First Amendment rights. Eventually, the two sides came to an agreement to simply block the objectionable Nazi content in France, which soon became the prevailing solution to any issue of social media content infringing laws in other countries.

“It was an undermining of the global accessibility of information, and one of the first steps towards the fragmentation of internet content into the territorial jurisdictions of states,” he told The Verge.

In addition to seeking to avoid potential fines posed by the OSA, Byrne wants to break that detente. “None of my clients, including 4chan, will allow themselves to be deputized by a hostile foreign government which wants to censor its own people,” he wrote. “Ofcom has the power, if it wants, to get a court order and serve that order on UK-based ISPs to DNS block 4chan. That is entirely a domestic UK matter for Ofcom and the British courts to decide upon.”

If the suit — or Trump administration intervention — favors 4chan and other Ofcom targets, the result could be a blow against the DSA, OSA, and similar laws.

“I think what makes it most interesting in this case,” Mueller added, “is that the US government, apparently, [would be] backing 4Chan’s rights.”

Correction, August 23: a previous version of this article incorrectly stated that Rumble was a previous client of Byrne’s. He has not represented Rumble and currently does not.

Tina Nguyen, Senior Reporter, Washington

Game Updates

Louisiana sues Roblox for allegedly choosing “profits over child safety”, claiming “it’s basically open season for sex predators on this app”

by admin August 17, 2025


The U.S. state of Louisiana is suing Roblox, alleging it facilitates “the sexual exploitation of Louisiana’s children”.

In a statement, Louisiana Attorney General Liz Murrill claimed Roblox “endangers the safety of the children” of the state, writing: “Roblox is overrun with harmful content and child predators because it prioritises user growth, revenue, and profits over child safety.

“Every parent should be aware of the clear and present danger posed to their children by Roblox so they can prevent the unthinkable from ever happening in their own home.”

The legal papers then name several “highly inappropriate” Roblox mini games such as Escape to Epstein Island, Public Bathroom Simulator, and Diddy Party.

“These games and others are often filled with sexually explicit material and simulated sexual activity such as child gang rape. A recent report even revealed a group of 3,334 members openly traded child pornography and solicited sexual acts from minors,” the Louisiana announcement says, citing a 2024 report.

Roblox is violating Louisiana law – choosing profits over child safety. It’s basically open season for sex predators on this platform. pic.twitter.com/fGSQ8IFgWw

— Attorney General Liz Murrill (@AGLizMurrill) August 15, 2025


The lawsuit claims that while Roblox requires children to have permission from their parents or guardians to open an account, the company “does nothing to confirm or document that parental permission has been given, no matter how young a child is. Nor does Defendant require a parent to confirm the age given when a child signs up to use Roblox”.

“[Roblox] has access to biometric age verification software that requires the user to take a photo of a government-issued ID along with a real-time selfie photo that is then verified through artificial intelligence,” the AG adds. “However, while Defendant utilises this software for other purposes, Defendant intentionally does not utilise this feature when new accounts are created.”

In a press conference announcing the lawsuit, Murrill said: “So [Roblox] have chosen profits over child safety. It’s basically open season for sex predators on this app. For this reason, and all of the others that we’ve talked about today, Roblox is violating Louisiana law, and that’s why we filed this lawsuit.”

Roblox said it does not comment on pending litigation, but stressed “it would like to address erroneous claims and misconceptions about our platform, our commitment to safety, and our overall safety track record”.

“Every day, tens of millions of people around the world use Roblox to learn STEM skills, play, and imagine, and have a safe experience on our platform. Any assertion that Roblox would intentionally put our users at risk of exploitation is simply untrue. No system is perfect, and bad actors adapt to evade detection, including efforts to take users to other platforms, where safety standards and moderation practices may differ. We continuously work to block those efforts and to enhance our moderation approaches to promote a safe and enjoyable environment for all users.”

It added that it is constantly innovating safety tools and launching new safeguards, has taken an industry-leading stance on age-based communication, and that 64 percent of its players are aged 13 or over. It also dedicates substantial resources to help detect and prevent inappropriate content and behaviour, and collaborates with law enforcement and government agencies, as well as mental health organisations, child safety organisations, and parental advocacy groups “to keep users safe on the platform”.

“We know safety is critically important to families, and we strive to empower our community of parents and caregivers to help ensure a safe online experience for their children. This includes a suite of easy-to-use parental controls to provide parents with more control and clarity on what their kids and teens are doing on Roblox,” the statement concludes.

“We aim to create one of the safest online environments for users, a goal not only core to our founding values but contrary to certain assertions, one we believe is critical to our long-term vision and success. We understand there is always more work to be done, and we are committed to making Roblox a safe and positive environment for all users.”





Product Reviews

Pebblebee Is Getting Serious About Personal Safety Tracking

by admin August 17, 2025


Think of Bluetooth trackers and safety in the past few years and your first thought might be the misuse of Apple AirTags and similar devices against women in domestic abuse and stalking cases.

Alongside collaborative initiatives to counter and shut down these malicious uses (such as the IETF’s Detection of Unwanted Location Trackers, or DULT, standard), tracker makers themselves are flipping the script, turning tech that has been used to monitor women against their will into tech that protects them.

In mid-July, Seattle-based Pebblebee announced a new, free SOS safety feature, named Alert, for its $35 Clip Bluetooth tracker. Like the rest of the company’s Universal line-up, the Clip can be set up with either Apple’s Find My or Google’s Find Hub network, is made from around 30 percent recycled plastic, and runs on a rechargeable battery that lasts 12-plus months between USB-C charges.

When multi-pressed (six or more presses), the Clip can trigger a siren alarm and flashing LED lights, and automatically send an SMS text notification to one “Safety Circle” contact that you’ve pre-saved in the Pebblebee app.

Image courtesy of Pebblebee

It’s simple to set up the contact in the app, and a long press on the device shuts off the siren and LEDs, though it’s unlikely you’ll fumble or accidentally set this one off. The Clip’s siren isn’t as loud, and its lights don’t carry as far, as a dedicated personal alarm’s would, but they’re enough to alert passersby when out walking at night. Plus, the Pebblebee website states: “The Alert functionality including the siren, strobe, and first Safety Circle contact is and will always be completely free.”

We tested out the SOS system and it worked without a hiccup. Our Safety Circle buddy received a text saying: “URGENT: Sophie activated a Pebblebee Alert. Please check in immediately” with a link to the correct location via Google Maps. Clicking the “Mark as safe” button in the app and/or long-pressing on the Clip sends a follow-up text to say “Sophie cancelled their Pebblebee Alert” with the last location link.

Now, however, Pebblebee is adding a paid-for subscription option, named Alert Live, which offers a Safety Circle of up to five contacts to receive the SOS text notification when triggered via the Clip, plus live location tracking for these contacts, for $3 a month or $26 a year. There’s also a new Silent Mode, which sends the alert without the siren and LEDs, for both free and paid-for users: useful, though we haven’t had the chance to test this or the new real-time location sharing out yet.
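Pieced together from the behaviour described above, the alert logic amounts to a small state machine: six or more presses fire the siren, strobe, and a text to the Safety Circle (or just the text in Silent Mode); a long press cancels and sends a follow-up; the free tier notifies one contact and Alert Live notifies up to five. The sketch below is our own illustration of that flow under those assumptions, not Pebblebee’s firmware or API, and the names and message strings are only modelled on what we saw in testing:

```python
# Illustrative sketch of the Pebblebee Alert flow described above.
# Not Pebblebee's actual firmware or API; names and structure are assumptions.
from dataclasses import dataclass, field

FREE_CONTACT_LIMIT = 1   # free Alert tier: one Safety Circle contact
PAID_CONTACT_LIMIT = 5   # Alert Live subscription: up to five contacts

def send_sms(to: str, body: str) -> None:
    print(f"SMS to {to}: {body}")  # stand-in for the app's real SMS dispatch

@dataclass
class Clip:
    owner: str
    contacts: list[str]
    subscribed: bool = False                 # Alert Live
    silent_mode: bool = False                # alert without siren/strobe
    alert_active: bool = field(default=False, init=False)

    def _limit(self) -> int:
        return PAID_CONTACT_LIMIT if self.subscribed else FREE_CONTACT_LIMIT

    def press(self, count: int, location_url: str) -> None:
        if count >= 6:                       # six-plus presses trigger an alert
            self.alert_active = True
            if not self.silent_mode:
                print("siren + strobe on")
            for contact in self.contacts[: self._limit()]:
                send_sms(contact, f"URGENT: {self.owner} activated a Pebblebee Alert. "
                                  f"Please check in immediately. {location_url}")

    def long_press(self, location_url: str) -> None:
        if self.alert_active:                # long press cancels and notifies again
            self.alert_active = False
            print("siren + strobe off")
            for contact in self.contacts[: self._limit()]:
                send_sms(contact, f"{self.owner} cancelled their Pebblebee Alert. {location_url}")
```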



Esports

Petition to remove Roblox CEO reaches 100k signatures amid child safety concerns

by admin August 17, 2025



A petition calling for the resignation of Roblox CEO David Baszucki has gained over 100,000 signatures amid growing concerns over child safety on the platform.

The petition, started on August 9, 2025, accused Roblox of failing to protect children, alleging that the company has repeatedly overlooked issues of exploitation and harmful content.

It further claimed that, under Baszucki’s leadership, the safety of minors had been compromised while accountability and transparency remained lacking.

The petition further alleged that Roblox had been slow to remove predators, allowing some to remain active on the platform even after they were reported. It also claimed the company permitted harmful content to circulate while over-relying on automated moderation that punishes innocent players.


Additional complaints included allegations that community members had faced retaliation and that leadership had ignored mounting criticism and calls for reform.

Concern over the safety of children on Roblox has grown since YouTuber and predator catcher Schlep was banned from the platform.

Roblox leadership faces increasing pressure

While the petition provides no specific names, Schlep, a YouTuber known for confronting alleged predators on Roblox, was banned from the platform and sent a cease and desist notice by the company, threatening legal action if they didn’t halt “unauthorized and harmful activities on the Roblox platform.”


Roblox has since defended the decision to ban Schlep, describing their actions as vigilantism. In an August 16 video providing an update on the company’s safety initiatives, Roblox Chief Safety Officer Matt Kaufman stated that such vigilante activity put the community at “greater risk.”


On August 15, To Catch a Predator host Chris Hansen confirmed in an X/Twitter post that he was investigating “exploitation of children on the popular gaming platform, Roblox,” and had been in contact with law enforcement as well as Schlep.



Esports

“We believe these restrictions harm creative expression.” The reaction to the UK’s Online Safety Act

by admin August 17, 2025


“This is not a law fit for purpose,” says the journalist and game developer John Szczepaniak. “This is idiocy and insanity of the highest order.”

Szczepaniak made the game Lady Priest Lawnmower as a joke – riffing on the ZX Spectrum’s similarly silly Advanced Lawnmower Simulator. But when the UK’s Online Safety Act (OSA) came into effect in late July, he found that British users of itch.io could no longer access his author page.

“It’s all just a parody,” Szczepaniak says. “But as you can see Lady Priest Lawnmower is deemed adult, and if only one game is deemed adult your entire profile page is blocked.” He believes the game tripped an alarm because it features kidnapping. “What about the original Donkey Kong, where Pauline is kidnapped?”

Lady Priest Lawnmower on Itch

Leaf Corcoran, itch.io’s founder, has said that author pages containing NSFW or adult content will remain blocked in the UK – until the site finds a ‘digital ID’ partner that can provide an age verification solution they’re happy with. In the meantime, itch.io is encouraging developers to submit an appeal if they think they’ve been incorrectly targeted. “I refuse to do this. This entire OSA banning nonsense should never have taken place,” Szczepaniak says. “I want the OSA laws repealed!”

The OSA is a set of laws intended to protect users online. It puts a new onus on game developers and platform holders to prevent children from accessing anything harmful or age-inappropriate. It requires that parents and kids are given clear and easy ways to report problems, and that adults be given more control over the type of content they see.

Frustration and panic

This change has been a long time coming – visible on the horizon and well-signalled by the UK government – but its arrival has led to a wave of frustration and panic among those who make games and run their associated communities. “While we will always comply with legal requirements, we disagree with this policy’s approach,” writes itch.io’s Corcoran. “We believe these restrictions harm creative expression and make it harder for independent creators to reach their audiences.”

Ofcom, the UK’s independent regulator of online safety and enforcer of the OSA, now has dedicated members of staff who are focused on and engaging with games companies.

“I think that’s possibly why we as an industry feel a bit more exposed, just because this is one of the first times that a regulator has paid attention to us from day one,” says Isabel Davies, a senior associate at the tech-focused law firm Wiggin. “Whereas normally what happens is social media companies get hit with a new piece of legislation, and we get somewhat taken along for the ride.”

Since the video-game boom of the COVID-19 pandemic, governments and regulators have started paying special attention to the interactive arts. “We’re on a lot more people’s radars,” Davies says. “I think the OSA is just a prime example of one of those situations.”

The Act was passed in 2023, and Ofcom has been consulting with companies inside and outside the games industry ever since. “We weren’t completely caught off-guard,” Davies says. But in the last few weeks, a requirement for companies to protect children from certain ‘legal but harmful’ content has come into force.


“That was also the same time that pornography sites were told to start age-blocking kids, which is why I think this has caused such a kerfuffle,” Davies says. “And one of the things that I think has been oversimplified is that you see some commentators out there saying you have to do age assurance in all cases.”

Age-gating might be a great help in compliance with the law, but in many instances, it may also be overkill – even for game services that include user-generated content, chat, and community features.

“What you do have to do are your risk assessments,” Davies says. “Assess your risks properly and work out what measures you need to employ that may or may not involve age assurance. There may be other ways you can achieve certain goals to protect people.” If a games company is already employing great moderation tools and parental controls, for instance, it might meet many of its obligations that way. “So it’s really important for any service, including games, to not jump the gun with any of this.”

John Szczepaniak’s Itch author page is blocked in the UK

Even when age-gating is necessary, there’s room for nuance. One example of a thoughtful approach to compliance is Newgrounds, the venerable browser game portal. Despite missing Ofcom’s most recent deadline, the site has been working with the UK regulator for the past year. Its plan involves a number of smart assumptions – for instance, that any UK user with an account more than ten years old or access to a credit card is already over the age of 18. “Regardless of age verification, these overhauls have been benefitting the site with better performance and will make NG easier to maintain into the future,” says founder Tom Fulp.

As Fulp notes, however, this invention was born of sheer necessity in the face of more expensive solutions. “We are not planning to offer things like ID checks or facial recognition because these require us to pay a third party to confirm each person,” he writes. “Because Newgrounds runs at a loss and doesn’t monetize users very well, this is not an option for us. As Wired noted, Big Tech is the only winner of the Online Safety Act because smaller websites can’t afford to keep up with this sort of regulation.”
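Fulp’s “smart assumptions” boil down to cheap signals that let most established UK accounts skip a paid verification step entirely. A rough sketch of that kind of heuristic, with hypothetical field names (Newgrounds hasn’t published its implementation):

```python
from datetime import date

def skips_extra_age_checks(country: str, account_created: date,
                           has_card_on_file: bool, today: date | None = None) -> bool:
    """True if a user can be assumed 18+ without further verification, following
    the assumptions Fulp describes: a UK account more than ten years old, or one
    with a credit card attached. Field names here are hypothetical."""
    if country != "GB":
        return True  # the age-assurance duty discussed here targets UK users
    today = today or date.today()
    account_age_years = (today - account_created).days / 365.25
    return account_age_years >= 10 or has_card_on_file
```

Anyone who fails a check like this would then fall through to whatever age-assurance step the site eventually adds for UK visitors.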

Administrative burden

One of the louder criticisms of the OSA is that it’s particularly unfriendly to smaller companies, for whom simply parsing the thousands of pages of official guidance is a lengthy and disruptive process. “Certainly for me as a lawyer, I’m aware that there is a lot to get through,” Davies says. “So as someone who isn’t in this area, I can completely understand why they’re probably thinking, ‘What is this!?'”

It’s perhaps not surprising that this administrative overwhelm – along with the prospect of fines capping at £18 million or 10% of annual global turnover, whichever is higher – has frightened some companies into temporarily suspending services in the UK while they figure out the details. And it’s important to note that the OSA arrives against a backdrop of wider moderation and censorship concerns. Platforms like itch.io have been scrambling to address the complaints of prudish payment processors, which has led to some developers suffering a double blow when it comes to discoverability.

Robert Yang, whose games about gay culture sometimes involve nudity, was already subject to a delisting on the itch.io store. And in the course of researching this piece, GamesIndustry.biz discovered that his creator page is currently inaccessible in the UK as well. “I wasn’t aware,” Yang says. “I’m obviously not happy. I have plenty of games that aren’t adult games too.”

Such shotgun measures only feed fears that spaces for risk-taking art are being squeezed, and that the ability of video games to carry messages will suffer as a result. “My silly little amateur games are an insignificant casualty in a much greater fire that has obliterated freedom of expression and freedom of thought in the UK,” Szczepaniak says.


Yet Davies hopes that in the long term, working with the Act will become more straightforward. “When GDPR came out in 2018 there was a massive panic, and it took everyone a while to get their heads around things,” she says. “My hope is that as time goes on, compliance will get a bit easier. It will become a bit more of a known thing. People will have gone through the process. But as of right now, I think for many indies it will feel like a big burden. Which is why it’s important to speak to your trade bodies, your advisors and communities about this.”

Davies recommends the digital tools that Ofcom has published on its website to help navigate the risk assessment process. “I would say it’s a starting point, it’s definitely not the be-all-and-end-all,” she says. “But it’s a really helpful way to get your head around, ‘OK, what is Ofcom expecting to see? And how do I assess the risks of someone trying to recruit another user for terrorism in my service, for example?'”

As scary as the Act can seem, small businesses shouldn’t worry that they’re suddenly going to be shut down by an unexpected fine. “Ultimately, Ofcom isn’t expecting everyone to have everything resolved immediately,” Davies says. “It’s certainly at the period now where it seems to be doing some enforcement against certain sectors, but equally, in games it’s currently here to engage and help businesses understand what they should be doing.”

Time to assess

If a company’s service presents a big risk, then it might be wise to pause it. But plenty of companies might have less to do than they think.

“If you’ve had a long history of your forum running into issues with illegal content, then maybe turn it off for now until you know what you need to do,” Davies says. “But if you’re running a small forum which is used by a relatively small number of people, and the conversations are mainly about your game or bug tickets or some fan art that people have drawn, you would hope it’s probably going to be relatively low risk in practice. Again – get your risk assessments done!”


If a time is coming when game platforms will find a more harmonious balance with the OSA, for the benefit of both creators and fans, it can’t come soon enough. In our current moment, rushed and overbearing implementations of the law are leading to upset and disillusionment among the very creative minds our industry depends on.

“Itch is an escape from reality, and an escape from the corporate nature of triple-A gaming,” Szczepaniak says. “None of my individual games have had more than 200 downloads. But making them is fun for me. Yet thanks to the OSA, I’m being treated as some sort of pornographer? Some sort of pariah that needs to be kept away from society to keep it safe?

“I feel deeply saddened that I am banned in the UK.”



Gaming Gear

Roblox cracks down on its user-created content following multiple child safety lawsuits

by admin August 17, 2025


Following a wave of lawsuits alleging that Roblox doesn’t provide a safe environment for its underage users, the gaming platform made a series of sweeping updates to its policies. To address recent concerns, Roblox published a post on its website detailing these major changes, including restricting all unrated experiences, which is what Roblox calls its user-generated games, to the developer or those actively working with them. Roblox said this change will roll out in the coming months, representing a big shift from its previous policy that allowed users 13 or older to access unrated experiences.

To further prevent any inappropriate behavior, any “social hangout” experiences that depict private spaces, like bedrooms or bathrooms, will be limited to ID-verified users who are 17 or older. Roblox will also restrict social hangout games that mostly take place in those previously mentioned private spaces or adult-only places, like bars or clubs, to users who are at least 17 and have been ID-verified. To assist with the new rules, Roblox will roll out a new tool that automatically detects “violative scenes,” or more simply, user activity that goes against the rules. According to Roblox’s new policies, a server that hits enough violations will automatically get taken down and will have to work with the Roblox team to adjust the experience and get it back online.
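Roblox hasn’t said how the detector or the takedown threshold is implemented, but the policy as described reduces to a per-server counter with an automatic cutoff. A hypothetical sketch of that rule; the threshold value and all names here are invented for illustration:

```python
# Hypothetical illustration of the threshold-based takedown Roblox describes;
# the detector, the threshold value, and these names are all invented.
VIOLATION_THRESHOLD = 3

violations: dict[str, int] = {}   # server_id -> violative scenes detected so far
taken_down: set[str] = set()

def record_violation(server_id: str) -> None:
    violations[server_id] = violations.get(server_id, 0) + 1
    if violations[server_id] >= VIOLATION_THRESHOLD:
        # Server goes offline until the developer works with Roblox
        # to adjust the experience and bring it back online.
        taken_down.add(server_id)
```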

These policy changes come after several lawsuits were filed against Roblox that claim the game doesn’t protect its younger users. In response to the lawsuit filed by Louisiana’s attorney general, Roblox wrote in a separate post that it works to block any efforts at exploitative behavior and constantly enhances its moderation approaches.

“Any assertion that Roblox would intentionally put our users at risk of exploitation is simply untrue,” the company statement read. “No system is perfect and bad actors adapt to evade detection, including efforts to take users to other platforms, where safety standards and moderation practices may differ.”



Product Reviews

Tesla’s inaugural Robotaxi rides will have a human ‘safety monitor’ on board

by admin June 22, 2025


A select few will soon get to experience Tesla’s robotaxi service for the first time, but they won’t be alone in the car. The company plans to launch its fully autonomous ride-hailing service in Austin, Texas tomorrow, but a “Tesla Safety Monitor” will accompany the first riders, according to email invitations sent out to “Early Access Riders.” It’s unclear what capabilities the safety monitor will have, but they will sit in the front passenger seat of a self-driving Model Y.

The email outlined several parameters: riders are limited to a geofenced area that excludes airports, may run into unavailability due to bad weather, and can only hail a robotaxi between 6 am and midnight. This restrictive launch will reportedly offer only 10 cars and comes after a delay from an initial launch date of June 12.
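Taken together, the invitation’s parameters amount to a simple availability check: inside the geofence, not an airport, weather permitting, and between 6 am and midnight. A hedged sketch of that gating logic; the function and field names are ours, not Tesla’s:

```python
from datetime import time

SERVICE_OPENS = time(6, 0)  # rides run from 6 am until midnight

def can_hail_robotaxi(in_geofence: bool, is_airport: bool,
                      severe_weather: bool, local_time: time) -> bool:
    """Rough model of the early-access limits in Tesla's invitation; the names
    and structure are hypothetical, not Tesla's software."""
    within_hours = local_time >= SERVICE_OPENS  # midnight is the implicit cutoff
    return in_geofence and not is_airport and not severe_weather and within_hours
```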

With the official date set, Tesla will offer its robotaxi service only with the Model Y for now. This service will lay the groundwork for an eventual Cybercab release, which isn’t expected to start production until at least 2026, according to the company. Tesla’s robotaxi will face competition from Waymo, which began offering its competing service to Austin residents in March.



NFT Gaming

‘Users Need Safety’: Former SpaceX Manager Raises $4.2M for Crypto Platform Stackup

by admin June 20, 2025



In brief

  • A former mission manager at SpaceX has raised $4.2 million for crypto startup Stackup.
  • The platform seeks to give businesses “centralized control of decentralized assets” using account abstraction.
  • The company’s seed funding round saw participation from Y Combinator and Digital Currency Group.

Stackup, an enterprise-grade platform for managing on-chain business operations, said on Friday that it secured $4.2 million in seed funding.

The Los Angeles-based firm, which previously built account abstraction tech for Coinbase and Trust Wallet, said it will use the funds to further develop its platform, which gives businesses “centralized control of decentralized assets,” according to the company’s website.

Stackup’s seed funding round was led by venture capital firm 1kx, with participation from Y Combinator, Digital Currency Group, and three other firms.

Stackup was co-founded by CTO Hazim Jumali, an Ethereum Foundation grantee, and CEO John Rising, a former senior mission manager at Elon Musk’s SpaceX. In the overlap between crypto and aerospace engineering, Rising told Decrypt that precision is key.



Four months after he graduated from college, while working at Virgin Galactic, Rising was standing next to a pilot’s family when a mishap “caused the vehicle to break up in the sky.” Despite extensive design and testing, the pilot lost his life.

“We had spent so much time making sure the machine works perfectly that we disregarded the most important part of the system as a whole: that a human is operating it,” Rising said. “Builders in crypto tend to really think about security, which is about preventing unauthorized access, when really users need safety.”

Account abstraction, which debuted on Ethereum’s mainnet in 2023, allows users to create non-custodial wallets as programmable smart contracts. Ethereum co-founder Vitalik Buterin once said the goal is to make crypto wallets as simple to use as email, through features such as easy wallet recovery and signless transactions.

Because account abstraction allows for wallets with custom logic, Rising said Stackup supports features that businesses need, such as spending limits for accounts, the ability to specify who can receive funds, or reviewing transactions in bulk before sending them with one click. 

Among other features, like the ability to connect a bank account to a wallet for seamless transfers, the goal is for Stackup to prevent catastrophic on-chain failures, Rising said.
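Account abstraction is what makes controls like these possible: because the wallet itself is programmable, a policy check can run before anything is signed or sent. The sketch below is a generic illustration of the kinds of rules Rising mentions (spending limits, approved recipients, batch review), not Stackup’s actual product or API:

```python
# Generic illustration of policy checks a programmable (account-abstracted)
# wallet could enforce; not Stackup's actual API.
from dataclasses import dataclass

@dataclass
class Transaction:
    to: str
    amount: float  # in the account's base asset, e.g. a stablecoin

@dataclass
class AccountPolicy:
    spending_limit: float          # maximum total spend per batch
    allowed_recipients: set[str]   # who may receive funds

    def violations(self, batch: list[Transaction]) -> list[str]:
        problems = []
        total = sum(tx.amount for tx in batch)
        if total > self.spending_limit:
            problems.append(f"batch total {total} exceeds limit {self.spending_limit}")
        for tx in batch:
            if tx.to not in self.allowed_recipients:
                problems.append(f"recipient {tx.to} is not on the allowlist")
        return problems

# Review a whole batch in one place before anything is sent on-chain.
policy = AccountPolicy(spending_limit=50_000, allowed_recipients={"0xPayroll", "0xVendorA"})
batch = [Transaction("0xPayroll", 30_000), Transaction("0xUnknown", 5_000)]
print(policy.violations(batch))  # flags the unlisted recipient
```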

For the average SpaceX launch, $60 million worth of cargo may be on the line, Rising said. Earlier this year, a security incident at crypto exchange Bybit resulted in a $1.4 billion hack.

“Imagine if SpaceX had 20 launch failures?” he asked. “It’s the same amount of loss in dollar terms.”

Although 3,000 individuals are involved in a SpaceX launch, Rising said that the team at Stackup is much more limited. Currently, the company is piloted by a group of four.

Edited by James Rubin

Gaming Gear

Period Data ‘Gold Mine’ Poses Serious Health and Safety Risks, Report Finds

by admin June 10, 2025


Apps that help people track their menstrual cycle are data “gold mines” for advertisers, a new report warns. Advertisers use this highly valuable data for customer profiling, allowing them to tailor marketing campaigns to specific groups of consumers.

The report, published by the University of Cambridge’s Minderoo Centre for Technology and Democracy on Tuesday, June 10, explains that the risks to app users go far beyond just targeted ads. When this data falls into the wrong hands, it can affect users’ job prospects and lead to workplace surveillance, health insurance discrimination, and cyberstalking. It has even been used to limit access to abortion in the U.S., the study warns.

Hundreds of millions of people use period tracking apps. A 2024 study estimated that the number of global downloads for the three most popular apps exceeds 250 million. These platforms are run by companies that profit from the mountain of user data they collect—particularly pregnancy data. According to the University of Cambridge report, data on pregnancy is 200 times more valuable to advertisers than data on age, gender, or location.

Investigations conducted in 2019 and 2020 by Privacy International, a U.K.-based nonprofit, found that multiple apps directly shared personal data with advertisers. A follow-up study published on May 28 found that while major menstrual app companies have improved their approach to data privacy, they still collect device data from users in the U.K. and U.S. with “no meaningful consent.”

Stefanie Felsberger, sociologist and lead author of the University of Cambridge report, interviewed period tracking app users in Austria to understand why they use them and what they track. She was surprised to find that many people she spoke with didn’t think of their menstrual data as personal or intimate and were unaware of its incredible commercial value.

“Period tracking apps collect a vast number of different kinds of information,” Felsberger told Gizmodo. “They don’t just collect information about the menstrual cycle as such, they also collect information about people’s reproductive choices, sexual activities, their wellbeing, health, [and] medication intake,” she said. These apps also gather background information about users, including their age, gender, IP addresses, app behavior, and device information, she added.

“We have limited and also changing knowledge about how and where this data has been shared and who has access to it,” Felsberger said.

In the U.S., menstrual tracking apps are regulated as general wellness devices, so the data they collect don’t get any special legal protections, she explained. Advertisers aren’t the only ones who can exploit this lack of safeguarding to access menstrual data. Government officials can also get their hands on this information and use it to restrict abortion access.


Felsberger’s report highlights two such cases, though in these instances, menstrual data did not come specifically from period tracking apps. Still, they illustrate how governments can use this information to limit access to abortion at both state and federal levels.

In 2019, Missouri’s state health department used menstrual tracking data to investigate failed abortions. They also tracked patients’ medical ID numbers, the gestational age of fetuses, and the dates of medical procedures. As a result of this investigation, the state attempted to withhold the license of St. Louis’ Planned Parenthood clinic—the only abortion provider in the state at that time. This led to a year-long legal battle that ultimately restored the clinic’s license.

During President Donald Trump’s first administration, the federal Office of Refugee Resettlement tracked the menstrual cycles of unaccompanied minors seeking asylum in the U.S. They aimed to prevent these minors from obtaining abortions even in cases of rape. A freedom of information request by MSNBC uncovered a spreadsheet containing dates of the minors’ menstrual cycles, lengths of their pregnancies, whether the sex had been consensual, and whether they had requested an abortion. 

These cases underscore the dangers of failing to protect users’ period tracking data, especially in a post-Dobbs world. Since Roe v. Wade was overturned in 2022, abortion access has become deeply fragmented across the U.S. This procedure is currently banned in 13 states and access is significantly limited in an additional 11 states.

In the European Union and the U.K., period tracking apps have more legal protections. “But they are not often implemented very well,” Felsberger said. Their privacy policies tend to be “very vague,” which makes it difficult for users to understand who can access their data.

“App developers and companies have a very large responsibility, because they do present themselves as providing people with this opportunity to learn about their menstrual cycles,” she said. “I think they should also do their utmost to keep people’s data safe and be transparent about the way that they use data.” There is also a need for stronger federal regulations, especially in the U.S., she added.

Given that these apps offer valuable health insights, it’s unrealistic to expect users to stop using them entirely. But Felsberger recommends switching to non-commercial period tracking apps that provide more data privacy. These platforms are run by non-profit organizations or research institutions that won’t share your information with third parties.

As the landscape of reproductive health becomes increasingly treacherous in the U.S., understanding how third parties may exploit your menstrual data has never been more important.

“Menstrual tracking data is being used to control people’s reproductive lives,” Felsberger said in a University statement. “It should not be left in the hands of private companies.”


