Loot boxes in games are gambling and should be banned for kids, say UK MPs – gpgmail


UK MPs have called for the government to regulate the games industry’s use of loot boxes under current gambling legislation — urging a blanket ban on the sale of loot boxes to players who are children.

Kids should instead be able to earn in-game credits to unlock loot boxes, MPs have suggested in a recommendation that won’t be music to the games industry’s ears.

Loot boxes refer to virtual items in games that can be bought with real-world money and do not reveal their contents in advance. The MPs argue the mechanic should be considered games of chance played for money’s worth and regulated by the UK Gambling Act.
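The “game of chance” framing is easy to see in code. Below is a hypothetical sketch of a loot-box draw, with invented rarity tiers and drop rates rather than any real game’s odds: the purchase price is fixed, but the outcome is a weighted random draw the buyer cannot know in advance.

```python
import random

# Hypothetical drop table -- the tiers and rates are invented for
# illustration, not taken from any real game.
DROP_TABLE = {
    "common": 0.70,
    "rare": 0.25,
    "legendary": 0.05,
}

def open_loot_box(rng: random.Random) -> str:
    """A paid-for box resolves to a random rarity tier: the buyer pays a
    fixed price but cannot know the outcome in advance."""
    roll = rng.random()
    cumulative = 0.0
    for tier, rate in DROP_TABLE.items():
        cumulative += rate
        if roll < cumulative:
            return tier
    return "common"  # guard against floating-point rounding at the boundary

rng = random.Random(42)
draws = [open_loot_box(rng) for _ in range(10_000)]
print({tier: draws.count(tier) for tier in DROP_TABLE})
```

Over many purchases the distribution converges on the drop rates, but any single purchase remains a stake on an uncertain outcome, which is the property MPs argue brings the mechanic within gambling law.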

The Department for Digital, Culture, Media and Sport’s (DCMS) parliamentary committee makes the recommendations in a report published today following an enquiry into immersive and addictive technologies that saw it take evidence from a number of tech companies, including Fortnite maker Epic Games; Facebook-owned Instagram; and Snapchat.

The committee said it found representatives from the games industry to be “wilfully obtuse” in answering questions about typical patterns of play — data the report emphasizes is necessary for proper understanding of how players are engaging with games — as well as calling out some games and social media company representatives for demonstrating “a lack of honesty and transparency”, leading it to question what the companies have to hide.

“The potential harms outlined in this report can be considered the direct result of the way in which the ‘attention economy’ is driven by the objective of maximising user engagement,” the committee writes in a summary of the report which it says explores “how data-rich immersive technologies are driven by business models that combine people’s data with design practices to have powerful psychological effects”.

As well as trying to pry information out of games companies, MPs also took evidence from gamers during the course of the enquiry.

In one instance the committee heard that a gamer spent up to £1,000 per year on loot box mechanics in Electronic Arts’ Fifa series.

A member of the public also reported that their adult son had built up debts of more than £50,000 through spending on microtransactions in online game RuneScape. The maker of that game, Jagex, told the committee that players “can potentially spend up to £1,000 a week or £5,000 a month”.

In addition to calling for gambling law to be applied to the industry’s lucrative loot box mechanic, the report calls on games makers to face up to responsibilities to protect players from potential harms, saying research into possible negative psychosocial harms has been hampered by the industry’s unwillingness to share play data.

“Data on how long people play games for is essential to understand what normal and healthy — and, conversely, abnormal and potentially unhealthy — engagement with gaming looks like. Games companies collect this information for their own marketing and design purposes; however, in evidence to us, representatives from the games industry were wilfully obtuse in answering our questions about typical patterns of play,” it writes.

“Although the vast majority of people who play games find it a positive experience, the minority who struggle to maintain control over how much they are playing experience serious consequences for them and their loved ones. At present, the games industry has not sufficiently accepted responsibility for either understanding or preventing this harm. Moreover, both policy-making and potential industry interventions are being hindered by a lack of robust evidence, which in part stems from companies’ unwillingness to share data about patterns of play.”

The report recommends the government require games makers to share aggregated player data with researchers, with the committee calling for a new regulator to oversee a levy on the industry to fund independent academic research — including into ‘Gaming disorder’, an addictive condition formally designated by the World Health Organization — and to ensure that “the relevant data is made available from the industry to enable it to be effective”.
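As a rough illustration of what sharing aggregated, rather than individual, play data could look like, the sketch below summarises weekly play hours into percentiles and the share of players in the heavy tail. The figures and the percentile cut-off are invented for the example; they are not real player data.

```python
from statistics import quantiles

# Invented weekly play-hours for a small sample of players.
weekly_hours = [2, 4, 5, 6, 7, 8, 8, 9, 10, 12, 14, 15, 18, 25, 40, 55]

# Deciles describe "typical" engagement without identifying anyone...
deciles = quantiles(weekly_hours, n=10)
p90 = deciles[8]  # the ninth cut point is the 90th percentile

# ...while the share of players far above typical is the tail that
# research into potentially unhealthy engagement would focus on.
heavy_share = sum(h > p90 for h in weekly_hours) / len(weekly_hours)
print(f"90th percentile: {p90} h/week; share above it: {heavy_share:.0%}")
```

Statistics of this shape could be handed to researchers without exposing any individual player, which is the distinction the committee draws between aggregated data sharing and the raw engagement data companies hold.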

“Social media platforms and online games makers are locked in a relentless battle to capture ever more of people’s attention, time and money. Their business models are built on this, but it’s time for them to be more responsible in dealing with the harms these technologies can cause for some users,” said DCMS committee chair, Damian Collins, in a statement.

“Loot boxes are particularly lucrative for games companies but come at a high cost, particularly for problem gamblers, while exposing children to potential harm. Buying a loot box is playing a game of chance and it is high time the gambling laws caught up. We challenge the Government to explain why loot boxes should be exempt from the Gambling Act.

“Gaming contributes to a global industry that generates billions in revenue. It is unacceptable that some companies with millions of users and children among them should be so ill-equipped to talk to us about the potential harm of their products. Gaming disorder based on excessive and addictive game play has been recognised by the World Health Organisation. It’s time for games companies to use the huge quantities of data they gather about their players, to do more to proactively identify vulnerable gamers.”

The committee wants independent research to inform the development of a behavioural design code of practice for online services. “This should be developed within an adequate timeframe to inform the future online harms regulator’s work around ‘designed addiction’ and ‘excessive screen time’,” it writes, citing the government’s plan for a new Internet regulator for online harms.

MPs are also concerned about the lack of robust age verification to keep children off age-restricted platforms and games.

The report identifies inconsistencies in the games industry’s ‘age-ratings’ stemming from self-regulation around the distribution of games (such as online games not being subject to a legally enforceable age-rating system, meaning voluntary ratings are used instead).

“Games companies should not assume that the responsibility to enforce age-ratings applies exclusively to the main delivery platforms: All companies and platforms that are making games available online should uphold the highest standards of enforcing age-ratings,” the committee writes.

“Both games companies and the social media platforms need to establish effective age verification tools. They currently do not exist on any of the major platforms which rely on self-certification from children and adults,” Collins adds.

During the enquiry it emerged that the UK government is working with tech companies including Snap to try to devise a centralized system for age verification for online platforms.

A section of the report on Effective Age Verification cites testimony from deputy information commissioner Steve Wood raising concerns about any move towards “wide-spread age verification [by] collecting hard identifiers from people, like scans of passports”.

Wood instead pointed the committee towards technological alternatives, such as age estimation, which he said uses “algorithms running behind the scenes using different types of data linked to the self-declaration of the age to work out whether this person is the age they say they are when they are on the platform”.

Snapchat’s Will Scougal also told the committee that its platform is able to monitor user signals to ensure users are the appropriate age — by tracking behavior and activity; location; and connections between users to flag a user as potentially underage. 
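A toy version of that kind of signal-based check might combine a few weak behavioural indicators into a risk score and flag accounts whose self-declared age disagrees with them. Everything below, including the signals, weights and threshold, is invented for illustration; it is not Snapchat’s or any other platform’s real system.

```python
from dataclasses import dataclass

@dataclass
class AccountSignals:
    declared_age: int              # what the user self-certified at sign-up
    connections_median_age: float  # typical age of accounts they interact with
    school_hours_activity: float   # share of activity during school hours

def underage_risk(s: AccountSignals) -> float:
    """Combine weak signals into a rough risk score in [0, 1]."""
    score = 0.0
    if s.connections_median_age < 16:
        score += 0.5  # mostly interacts with younger accounts
    if s.school_hours_activity > 0.5:
        score += 0.3  # active when most adults are at work or school is out
    if s.declared_age < 16:
        score += 0.2
    return score

def flag_for_review(s: AccountSignals, threshold: float = 0.6) -> bool:
    # An account declaring 18+ while scoring high on youth signals is the
    # self-certification mismatch the committee is concerned about.
    return s.declared_age >= 18 and underage_risk(s) >= threshold

suspicious = AccountSignals(declared_age=21,
                            connections_median_age=14.0,
                            school_hours_activity=0.7)
print(flag_for_review(suspicious))  # prints True
```

The point of age estimation, as Wood described it, is exactly this: no hard identifiers like passport scans, just inference from data the platform already sees.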

The report also makes a recommendation on deepfake content, with the committee saying that malicious creation and distribution of deepfake videos should be regarded as harmful content.

“The release of content like this could try to influence the outcome of elections and undermine people’s public reputation,” it warns. “Social media platforms should have clear policies in place for the removal of deepfakes. In the UK, the Government should include action against deepfakes as part of the duty of care social media companies should exercise in the interests of their users, as set out in the Online Harms White Paper.”

“Social media firms need to take action against known deepfake films, particularly when they have been designed to distort the appearance of people in an attempt to maliciously damage their public reputation, as was seen with the recent film of the Speaker of the US House of Representatives, Nancy Pelosi,” adds Collins.



Adarga closes £5M Series A funding for its Palantir-like AI platform


AI startup Adarga has closed a £5 million Series A round led by Allectus Capital. The news rather cloaks the fact that the company has been building up a head of steam since its founding in 2016, assembling what it says is a £30 million-plus sales pipeline through strategic collaborations with a number of global industrial partners and gradually building out its management team.

The proceeds will be used to continue the expansion of Adarga’s data science and software engineering teams and roll out internationally.

Adarga, which takes its name from the word for an old Moorish shield, is a London- and Bristol-based startup. It uses AI to change the way financial institutions, intelligence agencies and defence companies tackle problems, helping crunch vast amounts of data to identify possible threats before they occur. The startup’s proposition sounds similar to that of Palantir, which is known for working with the US military.

Adarga allows organizations to transform knowledge processes that are normally data-intensive and human-driven, analyzing vast volumes of data more quickly and accurately. Adarga clients can build up a ‘Knowledge Graph’ about subjects and targets.
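The general idea behind such a knowledge graph can be sketched in a few lines: entities that co-occur in source documents become linked nodes that analysts can query. The documents and entity names below are invented, and this illustrates the generic technique rather than Adarga’s actual product.

```python
from collections import defaultdict
from itertools import combinations

# Invented example documents, each with the entities extracted from it.
documents = {
    "report-1": ["Acme Shipping", "Port of Rotterdam", "J. Doe"],
    "report-2": ["J. Doe", "Acme Shipping"],
    "report-3": ["Port of Rotterdam", "Nordsee Freight"],
}

# Link every pair of entities that appear in the same document,
# remembering which document supports the link.
graph = defaultdict(set)
for doc_id, entities in documents.items():
    for a, b in combinations(sorted(set(entities)), 2):
        graph[a].add((b, doc_id))
        graph[b].add((a, doc_id))

# An analyst can then ask: what is connected to "Acme Shipping"?
neighbours = {entity for entity, _ in graph["Acme Shipping"]}
print(sorted(neighbours))  # prints ['J. Doe', 'Port of Rotterdam']
```

In a real system the entity extraction and the graph store would be far more sophisticated, but the query pattern, following links from a subject to everything connected with it, is the same.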

The UK government is a client, as is the finance sector, where the platform is used for financial analysis and by insurance companies. Founded in 2016, the company now has 26 employees, including data scientists from some of the UK’s top universities.

The company has received support from Benevolent AI, one of the key players in the UK AI tech scene. Benevolent AI, which is worth $2bn after a $115m funding round, is a minority shareholder in Adarga. It has not provided financial backing, but rather support in kind and technical help.

Rob Bassett Cross, CEO of Adarga, commented: “With the completion of this round, Adarga is focused on consolidating its competitive position in the UK defence and security sector. We are positioning ourselves as the software platform of choice for organisations who cannot deal effectively with the scale and complexity of their enterprise data and are actively seeking an alternative to knowledge intensive human processes. Built by experienced sector specialists, the Company has rapidly progressed a real solution to address the challenges of an ever-growing volume of unstructured data.”

Bassett Cross is an interesting guy, to say the least. You won’t find much about him on LinkedIn, but in previous interviews he has revealed that he is a former army officer and special operations expert who fought in Iraq and Afghanistan, and was awarded the Military Cross.

The company recently held a new annual event, the Adarga AI Symposium, at the Royal Institution in London, which featured futurist Mark Stevenson, Ranju Das of Amazon Web Services, and General Stanley A. McChrystal.

Matthew Gould, Head of Emerging Technology at Allectus Capital, said: “Adarga has developed a world-class analytics platform to support real-time critical decisioning by public sector and defence stakeholders. What Rob and the team have built in a short time is a hugely exciting example of the founder-led, disruptive businesses that we like to partner with – especially in an ever-increasing global threat landscape.”

Allectus Capital is based in Sydney, Australia and invests across Asia-Pacific, UK and US. It has previously invested in Cluey Learning (Series A, A$20M), Everproof, Switch Automation and Automio.



UK’s health data guardian sets a firm line for app development using patient data


The UK’s health data watchdog, the National Data Guardian (NDG), has published correspondence between her office and the national privacy watchdog which informed the ICO’s finding in 2017 that a data-sharing arrangement between an NHS Trust and Google-owned DeepMind broke the law.

The exchange was published following a Freedom of Information request by gpgmail.

In fall 2015 the Royal Free NHS Trust and DeepMind signed a data-sharing agreement which saw the medical records of 1.6 million people quietly passed to the AI company without patients being asked for their consent.

The scope of the data-sharing arrangement — ostensibly to develop a clinical task management app — was only brought to light by investigative journalism. That then triggered regulatory scrutiny — and the eventual finding by the ICO that there was no legal basis for the data to have been transferred in the first place.

Despite that, the app in question, Streams — which does not (currently) contain any AI but uses an NHS algorithm for detecting acute kidney injury — has continued being used in NHS hospitals.

DeepMind has also since announced it plans to transfer its health division to Google. Although — to our knowledge — no NHS trusts have yet signed new contracts for Streams with the ad giant.

In parallel with releasing her historical correspondence with the ICO, Dame Fiona Caldicott, the NDG, has written a blog post in which she articulates a clear regulatory position that the “reasonable expectations” of patients must govern non-direct care uses for people’s health data — rather than healthcare providers relying on whether doctors think developing such and such an app is a great idea.

The ICO had asked for guidance from the NDG on how to apply the common law duty of confidentiality, as part of its investigation into the Royal Free NHS Trust’s data-sharing arrangement with DeepMind for Streams.

In a subsequent audit of Streams that was required by the regulator, the trust’s law firm, Linklaters, argued that a call on whether a duty of confidentiality has been breached should be judged from the point of view of the clinician’s conscience, rather than the patient’s reasonable expectations.

Caldicott writes that she firmly disagrees with that “key argument”.

“It is my firm view that it is the patient’s perspective that is most important when judgements are being made about the use of their confidential information. My letter to the Information Commissioner sets out my thoughts on this matter in some detail,” she says, impressing the need for healthcare innovation to respect the trust and confidence of patients and the public.

“I do champion innovative technologies and new treatments that are powered by data. The mainstreaming of emerging fields such as genomics and artificial intelligence offer much promise and will change the face of medicine for patients and health professionals immeasurably… But my belief in innovation is coupled with an equally strong belief that these advancements must be introduced in a way that respects people’s confidentiality and delivers no surprises about how their data is used. In other words, the public’s reasonable expectations must be met.”

“Patients’ reasonable expectations are the touchstone of the common law duty of confidence,” she adds. “Providers who are introducing new, data-driven technologies, or partnering with third parties to help develop and test them, have called for clearer guidance about respecting data protection and confidentiality. I intend to work with the Information Commissioner and others to improve the advice available so that innovation can be undertaken safely: in compliance with the common law and the reasonable expectations of patients.

“The National Data Guardian is currently supporting the Health Research Authority in clarifying and updating guidance on the lawful use of patient data in the development of healthcare technologies.”

We reached out to the Royal Free NHS Trust and DeepMind for comment on the NDG’s opinion. At the time of writing neither had responded.

In parallel, Bloomberg reported this week that DeepMind co-founder, Mustafa Suleyman, is currently on leave from the company. (Suleyman has since tweeted that the break is temporary and for “personal” reasons, to “recharge”, and that he’s “looking forward to being back in the saddle at DeepMind soon”.)

The AI research company recently touted what it couched as a ‘breakthrough’ in predictive healthcare — saying it had developed an AI model for predicting the same condition that the Streams app is intended to alert for, though the model was built using US data from the Department of Veterans Affairs, which skews overwhelmingly male.

As we wrote at the time, the episode underscores the potential value locked up in NHS data — which offers population-level clinical data that the NHS could use to develop AI models of its own. Indeed, a 2017 government-commissioned review of the life sciences sector called for a strategy to “capture for the UK the value in algorithms generated using NHS data”.

The UK government is also now pushing a ‘tech-first’ approach to NHS service delivery.

Earlier this month the government announced it’s rerouting £250M in public funds for the NHS to set up an artificial intelligence lab that will work to expand the use of AI technologies within the service.

Last fall health secretary, Matt Hancock, set out his tech-first vision of future healthcare provision — saying he wanted “healthtech” apps and services to support “preventative, predictive and personalised care”.

So there are certainly growing opportunities for developing digital healthcare solutions to support the UK’s National Health Service.

As well as — now — clearer regulatory guidance that app development that wants to be informed by patient data must first win the trust and confidence of the people it hopes to serve.





The UK’s National Health Service is launching an AI lab


The UK government has announced it’s rerouting £250M (~$300M) in public funds for the country’s National Health Service (NHS) to set up an artificial intelligence lab that will work to expand the use of AI technologies within the service.

The Lab, which will sit within a new NHS unit tasked with overseeing the digitisation of the health and care system (aka: NHSX), will act as an interface for academic and industry experts, including potentially startups, encouraging research and collaboration with NHS entities (and data) — to drive health-related AI innovation and the uptake of AI-driven healthcare within the NHS. 

Last fall the then newly appointed health secretary, Matt Hancock, set out a tech-first vision of future healthcare provision — saying he wanted to transform NHS IT so it can accommodate “healthtech” to support “preventative, predictive and personalised care”.

In a press release announcing the AI lab, the Department of Health and Social Care suggested it would seek to tackle “some of the biggest challenges in health and care, including earlier cancer detection, new dementia treatments and more personalised care”.

Other suggested areas of focus include:

  • improving cancer screening by speeding up the results of tests, including mammograms, brain scans, eye scans and heart monitoring
  • using predictive models to better estimate future needs of beds, drugs, devices or surgeries
  • identifying which patients could be more easily treated in the community, reducing the pressure on the NHS and helping patients receive treatment closer to home
  • identifying patients most at risk of diseases such as heart disease or dementia, allowing for earlier diagnosis and cheaper, more focused, personalised prevention
  • building systems to detect people at risk of post-operative complications, infections or requiring follow-up from clinicians, improving patient safety and reducing readmission rates
  • upskilling the NHS workforce so they can use AI systems for day-to-day tasks
  • inspecting algorithms already used by the NHS to increase the standards of AI safety, making systems fairer, more robust and ensuring patient confidentiality is protected
  • automating routine admin tasks to free up clinicians so more time can be spent with patients

Google-owned UK AI specialist DeepMind has been an early mover in some of these areas — inking a partnership with a London-based NHS trust in 2015 to develop a clinical task management app called Streams that’s been rolled out to a number of NHS hospitals.

UK startup, Babylon Health, is another early mover in AI and app-based healthcare, developing a chatbot-style app for triaging primary care which it sells to the NHS. (Hancock himself is a user.)

In the case of DeepMind, the company also hoped to use the same cache of NHS data it obtained for Streams to develop an AI algorithm for earlier detection of a condition called acute kidney injury (AKI).

However the data-sharing partnership ran into trouble when concerns were raised about the legal basis for reusing patient data to develop AI. And in 2017 the UK’s data watchdog found DeepMind’s partner NHS trust had failed to obtain proper consents for the use of patients’ data.

DeepMind subsequently announced its own AI model for predicting AKI — trained on heavily skewed US patient data. It has also inked some AI research partnerships involving NHS patient data — such as this one with Moorfields Eye Hospital, aiming to build AIs to speed up predictions of degenerative eye conditions.

But an independent panel of reviewers engaged to interrogate DeepMind’s health app business raised early concerns about monopoly risks attached to NHS contracts that lock trusts to using its infrastructure for delivering digital healthcare.

Where healthcare AIs are concerned, representative clinical data is the real goldmine — and it’s the NHS that owns that.

So, provided NHSX properly manages the delivery infrastructure for future digital healthcare — to ensure systems adhere to open standards, and no single platform giant is allowed to lock others out — Hancock’s plan to open up NHS IT to the next wave of health-tech could deliver a transformative and healthy market for AI innovation that benefits startups and patients alike.

Commenting on the launch of NHSX in a statement, Hancock said: “We are on the cusp of a huge health tech revolution that could transform patient experience by making the NHS a truly predictive, preventive and personalised health and care service.

“I am determined to bring the benefits of technology to patients and staff, so the impact of our NHS Long Term Plan and this immediate, multimillion pound cash injection are felt by all. It’s part of our mission to make the NHS the best it can be.

“The experts tell us that because of our NHS and our tech talent, the UK could be the world leader in these advances in healthcare, so I’m determined to give the NHS the chance to be the world leader in saving lives through artificial intelligence and genomics.”

Simon Stevens, CEO of NHS England, added: “Carefully targeted AI is now ready for practical application in health services, and the investment announced today is another step in the right direction to help the NHS become a world leader in using these important technologies.

“In the first instance it should help personalise NHS screening and treatments for cancer, eye disease and a range of other conditions, as well as freeing up staff time, and our new NHS AI Lab will ensure the benefits of NHS data and innovation are fully harnessed for patients in this country.”

