Loot boxes in games are gambling and should be banned for kids, say UK MPs – gpgmail


UK MPs have called for the government to regulate the games industry’s use of loot boxes under current gambling legislation — urging a blanket ban on the sale of loot boxes to players who are children.

Kids should instead be able to earn in-game credits to unlock loot boxes, MPs have suggested in a recommendation that won’t be music to the games industry’s ears.

Loot boxes refer to virtual items in games that can be bought with real-world money and do not reveal their contents in advance. The MPs argue the mechanic should be considered a game of chance played for money’s worth and regulated under the UK Gambling Act.
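The mechanic the MPs describe can be made concrete with a short sketch: a purchase whose contents are decided by a weighted random draw, with the rarest items carrying the lowest odds. The item names and drop rates below are invented for illustration; real games publish (or withhold) their own odds.

```python
import random

# Hypothetical drop table for the loot box mechanic described above.
# Item names and probabilities are invented for this example.
DROP_TABLE = [
    ("common skin", 0.70),
    ("rare emote", 0.25),
    ("ultra-rare player card", 0.05),
]

def open_loot_box(rng=random):
    """Return one item chosen by weight; the buyer cannot know the result in advance."""
    items, weights = zip(*DROP_TABLE)
    return rng.choices(items, weights=weights, k=1)[0]

# Over many boxes the rarest item appears only ~5% of the time, which is
# why the committee likens the purchase to a game of chance.
results = [open_loot_box() for _ in range(10_000)]
rare_rate = results.count("ultra-rare player card") / len(results)
```

The point of the sketch is that each purchase is an independent chance event whose expected value is opaque to the buyer, which is the property the committee argues brings it within gambling legislation.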

The Department for Digital, Culture, Media and Sport’s (DCMS) parliamentary committee makes the recommendations in a report published today following an enquiry into immersive and addictive technologies that saw it take evidence from a number of tech companies including Fortnite maker Epic Games; Facebook-owned Instagram; and Snapchat.

The committee said it found representatives from the games industry to be “wilfully obtuse” in answering questions about typical patterns of play — data the report emphasizes is necessary for proper understanding of how players are engaging with games — as well as calling out some games and social media company representatives for demonstrating “a lack of honesty and transparency”, leading it to question what the companies have to hide.

“The potential harms outlined in this report can be considered the direct result of the way in which the ‘attention economy’ is driven by the objective of maximising user engagement,” the committee writes in a summary of the report which it says explores “how data-rich immersive technologies are driven by business models that combine people’s data with design practices to have powerful psychological effects”.

As well as trying to pry information out of games companies, MPs also took evidence from gamers during the course of the enquiry.

In one instance the committee heard that a gamer spent up to £1,000 per year on loot box mechanics in Electronic Arts’ FIFA series.

A member of the public also reported that their adult son had built up debts of more than £50,000 through spending on microtransactions in online game RuneScape. The maker of that game, Jagex, told the committee that players “can potentially spend up to £1,000 a week or £5,000 a month”.

In addition to calling for gambling law to be applied to the industry’s lucrative loot box mechanic, the report calls on games makers to face up to responsibilities to protect players from potential harms, saying research into possible negative psychosocial harms has been hampered by the industry’s unwillingness to share play data.

“Data on how long people play games for is essential to understand what normal and healthy — and, conversely, abnormal and potentially unhealthy — engagement with gaming looks like. Games companies collect this information for their own marketing and design purposes; however, in evidence to us, representatives from the games industry were wilfully obtuse in answering our questions about typical patterns of play,” it writes.

“Although the vast majority of people who play games find it a positive experience, the minority who struggle to maintain control over how much they are playing experience serious consequences for them and their loved ones. At present, the games industry has not sufficiently accepted responsibility for either understanding or preventing this harm. Moreover, both policy-making and potential industry interventions are being hindered by a lack of robust evidence, which in part stems from companies’ unwillingness to share data about patterns of play.”

The report recommends the government require games makers share aggregated player data with researchers, with the committee calling for a new regulator to oversee a levy on the industry to fund independent academic research — including into ‘gaming disorder’, an addictive condition formally designated by the World Health Organization — and to ensure that “the relevant data is made available from the industry to enable it to be effective”.

“Social media platforms and online games makers are locked in a relentless battle to capture ever more of people’s attention, time and money. Their business models are built on this, but it’s time for them to be more responsible in dealing with the harms these technologies can cause for some users,” said DCMS committee chair, Damian Collins, in a statement.

“Loot boxes are particularly lucrative for games companies but come at a high cost, particularly for problem gamblers, while exposing children to potential harm. Buying a loot box is playing a game of chance and it is high time the gambling laws caught up. We challenge the Government to explain why loot boxes should be exempt from the Gambling Act.

“Gaming contributes to a global industry that generates billions in revenue. It is unacceptable that some companies with millions of users and children among them should be so ill-equipped to talk to us about the potential harm of their products. Gaming disorder based on excessive and addictive game play has been recognised by the World Health Organisation. It’s time for games companies to use the huge quantities of data they gather about their players, to do more to proactively identify vulnerable gamers.”

The committee wants independent research to inform the development of a behavioural design code of practice for online services. “This should be developed within an adequate timeframe to inform the future online harms regulator’s work around ‘designed addiction’ and ‘excessive screen time’,” it writes, citing the government’s plan for a new Internet regulator for online harms.

MPs are also concerned about the lack of robust age verification to keep children off age-restricted platforms and games.

The report identifies inconsistencies in the games industry’s ‘age-ratings’ stemming from self-regulation around the distribution of games (such as online games not being subject to a legally enforceable age-rating system, meaning voluntary ratings are used instead).

“Games companies should not assume that the responsibility to enforce age-ratings applies exclusively to the main delivery platforms: All companies and platforms that are making games available online should uphold the highest standards of enforcing age-ratings,” the committee writes.

“Both games companies and the social media platforms need to establish effective age verification tools. They currently do not exist on any of the major platforms which rely on self-certification from children and adults,” Collins adds.

During the enquiry it emerged that the UK government is working with tech companies including Snap to try to devise a centralized system for age verification for online platforms.

A section of the report on Effective Age Verification cites testimony from deputy information commissioner Steve Wood raising concerns about any move towards “wide-spread age verification [by] collecting hard identifiers from people, like scans of passports”.

Wood instead pointed the committee towards technological alternatives, such as age estimation, which he said uses “algorithms running behind the scenes using different types of data linked to the self-declaration of the age to work out whether this person is the age they say they are when they are on the platform”.

Snapchat’s Will Scougal also told the committee that its platform is able to monitor user signals to ensure users are the appropriate age — by tracking behavior and activity; location; and connections between users to flag a user as potentially underage. 
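The age-estimation approach Wood and Scougal describe — checking a self-declared age against behavioural signals — can be sketched as a simple heuristic. Every signal name, weight and threshold below is invented for illustration; production systems use far richer data and statistical models.

```python
from dataclasses import dataclass

# Toy sketch of age estimation from behavioural signals, as described
# in the committee testimony above. All fields and thresholds are
# hypothetical, not drawn from any platform's actual system.
@dataclass
class UserSignals:
    declared_age: int
    connections_median_age: float  # median declared age of the user's contacts
    after_school_activity: float   # share of activity in the 3-6pm window, 0..1

def estimate_age(signals: UserSignals) -> float:
    """Crude estimate from the social graph plus one activity pattern."""
    estimate = signals.connections_median_age
    if signals.after_school_activity > 0.5:  # heavy after-school use skews young
        estimate -= 3.0
    return estimate

def flag_possible_underage(signals: UserSignals, tolerance: float = 5.0) -> bool:
    """Flag accounts whose declared age exceeds the estimate by more than `tolerance`."""
    return signals.declared_age - estimate_age(signals) > tolerance

suspicious = UserSignals(declared_age=21, connections_median_age=13.0,
                         after_school_activity=0.8)
plausible = UserSignals(declared_age=30, connections_median_age=29.0,
                        after_school_activity=0.1)
```

The design trade-off Wood highlights is visible even in this toy: estimation avoids collecting hard identifiers like passport scans, at the cost of being probabilistic rather than definitive.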

The report also makes a recommendation on deepfake content, with the committee saying that malicious creation and distribution of deepfake videos should be regarded as harmful content.

“The release of content like this could try to influence the outcome of elections and undermine people’s public reputation,” it warns. “Social media platforms should have clear policies in place for the removal of deepfakes. In the UK, the Government should include action against deepfakes as part of the duty of care social media companies should exercise in the interests of their users, as set out in the Online Harms White Paper.”

“Social media firms need to take action against known deepfake films, particularly when they have been designed to distort the appearance of people in an attempt to maliciously damage their public reputation, as was seen with the recent film of the Speaker of the US House of Representatives, Nancy Pelosi,” adds Collins.



Reps from DHS, the FBI and the ODNI met with tech companies at Facebook to talk election security


Representatives from the Federal Bureau of Investigation, the Office of the Director of National Intelligence and the Department of Homeland Security met with counterparts at tech companies including Facebook, Google, Microsoft and Twitter to discuss election security, Facebook confirmed.

“The purpose was to build on previous discussions and further strengthen strategic collaboration regarding the security of the 2020 U.S. state, federal, and presidential elections,” according to a statement from Facebook head of cybersecurity policy, Nathaniel Gleicher.

First reported by Bloomberg, the meeting between America’s largest technology companies and the trio of government security agencies responsible for election security is a sign of how seriously both the government and those companies are treating the threat of foreign interference in elections.

Earlier this year the Office of the Inspector General issued a report saying that the Department of Homeland Security has not done enough to safeguard elections in the United States.

Throughout the year, reports of persistent media manipulation and the dissemination of propaganda on social media platforms have cropped up not just in the United States but around the world.

In April, Facebook removed a number of accounts ahead of the Spanish election for their role in spreading misinformation about the campaign.

Companies have responded to the threat by updating different mechanisms for users to call out fake accounts and improving in-house technologies used to combat the spread of misinformation.

Twitter, for instance, launched a reporting tool whereby users can flag misleading tweets.

“Improving election security and countering information operations are complex challenges that no organization can solve alone,” said Gleicher in a statement. “Today’s meeting builds on our continuing commitment to work with industry and government partners, as well as with civil society and security experts, to better understand emerging threats and prepare for future elections.”



Reports say White House has drafted an order putting the FCC in charge of monitoring social media


In the draft executive order, the White House says it received more than 15,000 complaints about censorship by the technology platforms. The order also includes an offer to share the complaints with the Federal Trade Commission.

As part of the order, the Federal Trade Commission would be required to open a public complaint docket and coordinate with the Federal Communications Commission on investigations of how technology companies curate their platforms — and whether that curation is politically agnostic.

Under the proposed rule, any company whose monthly user base includes more than one-eighth of the U.S. population would be subject to oversight by the regulatory agencies. A roster of companies subject to the new scrutiny would include Facebook, Google, Instagram, Twitter, Snap and Pinterest.
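The one-eighth threshold can be made concrete with back-of-the-envelope arithmetic. The ~330 million U.S. population figure below is an approximation for illustration; the draft order reportedly defines the cut-off as a fraction of the population, not as a fixed user count.

```python
# Back-of-the-envelope check of the proposed one-eighth threshold.
# The population figure is an approximation, not taken from the order.
US_POPULATION = 330_000_000
threshold_monthly_users = US_POPULATION // 8  # 41,250,000 monthly users

def subject_to_oversight(monthly_users: int) -> bool:
    """Would a platform of this size fall under the proposed rule?"""
    return monthly_users > threshold_monthly_users
```

On that rough figure, the bar sits around 41 million monthly U.S. users, which is why the listed roster is limited to the very largest consumer platforms.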

At issue is how broadly or narrowly companies are protected under the Communications Decency Act, which was part of the Telecommunications Act of 1996. Social media companies use the Act to shield against liability for the posts, videos or articles that are uploaded from individual users or third parties.

The Trump administration isn’t alone in Washington in focusing on the laws that shield social media platforms from legal liability. House Speaker Nancy Pelosi took technology companies to task earlier this year in an interview with Recode.

The criticisms may come from different sides of the political spectrum, but their focus on the ways in which tech companies could use Section 230 of the Act is the same.

The White House’s executive order would ask the FCC to disqualify social media companies from immunity if they remove or limit the dissemination of posts without first notifying the user or third party that posted the material, or if the decision from the companies is deemed anti-competitive or unfair.

The FTC and FCC had not responded to a request for comment at the time of publication.

