BBC: TikTok sued for billions over use of children’s data

TikTok is facing a legal challenge from former children’s commissioner for England Anne Longfield over how it collects and uses children’s data.

The claim is being filed on behalf of millions of children in the UK and EU who have used the hugely popular video-sharing app.

If successful, the children affected could each be owed thousands of pounds.

TikTok said the case was without merit and it would fight it.

‘Sinister’

Lawyers will allege that TikTok takes children’s personal information, including phone numbers, videos, exact location and biometric data, without sufficient warning, transparency or the necessary consent required by law, and without children or parents knowing what is being done with that information.

In response, the video-sharing app said: “Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to help protect all users, and our teenage users in particular. We believe the claims lack merit and intend to vigorously defend the action.”

TikTok has more than 800 million users worldwide and parent firm ByteDance made billions in profits last year, with the vast majority of that coming via advertising revenue.

The claim is being launched on behalf of all children who have used TikTok since 25 May 2018, regardless of whether they have an account or what their privacy settings are. Children not wishing to be represented can opt out.

Ms Longfield told the BBC she was focusing on TikTok because, while all social media platforms collected information, TikTok had “excessive” data collection policies.

“TikTok is a hugely popular social media platform that has helped children keep in touch with their friends during an incredibly difficult year. However, behind the fun songs, dance challenges and lip-sync trends lies something far more sinister.”

She alleges the firm is “a data collection service that is thinly veiled as a social network” which has “deliberately and successfully deceived parents”.

She added that those parents have a “right to know” what private information is being collected via TikTok’s “shadowy data collection practices”.

The claimants are being represented by law firm Scott + Scott. Partner Tom Southwell said he believed the information collected by TikTok represents "a severe breach of UK and EU data protection law".

“TikTok and ByteDance’s advertising revenue is built on the personal information of its users, including children. Profiting from this information without fulfilling its legal obligations, and its moral duty to protect children online, is unacceptable.”

Age verification

The case is not without precedent.

In 2019, the Chinese firm was given a record $5.7m fine by the US Federal Trade Commission (FTC) for mishandling children's data.

The firm has been fined in South Korea over how it collects children’s data, and in the UK, it has been investigated by the Information Commissioner’s Office.

The FTC action revolved around Musical.ly, which was later incorporated into TikTok, knowingly hosting content published by users under the age of 13.

TikTok was ordered to delete the data and set up an age verification system.

According to Ofcom, 44% of eight to 12-year-olds in the UK use TikTok, despite its policies forbidding under-13s on the platform.

Class action

The legal action against TikTok was first brought by an anonymous 12-year-old girl last year, supported by Ms Longfield.

At the time, Ms Longfield said she was waiting to see the result of another case before proceeding with suing TikTok.

The case in question was brought by former Which? director Richard Lloyd on behalf of four million iPhone users who, he alleges, were illegally tracked by Google.

Despite being launched in 2017, the case has still not had the go-ahead and is due to be heard by the Supreme Court soon.

“It could be difficult for similar cases to succeed if the Supreme Court dismisses Mr Lloyd’s ability to bring his claim,” said Richard Leedham, partner at law firm Mishcon de Reya.


Source: BBC

Link: https://www.bbc.com/news/technology-56815480

The Guardian: Case launched against TikTok over collection of children’s data

Former children’s commissioner for England launches case against video-sharing app

A former children’s commissioner for England has launched a “landmark case” against the video-sharing app TikTok, alleging that it illegally collects the personal information of its child users.

Anne Longfield, who held the commissioner post between March 2015 and February this year, has lodged a claim in the high court on behalf of millions of children in the UK and the European Economic Area who have used TikTok since 25 May 2018.

She alleges the app is breaching UK and EU children’s data protection law and aims to stop it processing the information of millions of children, make it delete all such existing data and pay compensation she believes could run into billions of pounds.

Despite a minimum age requirement of 13, Ofcom found last year that 42% of UK eight to 12-year-olds used TikTok. As with other social media companies such as Facebook, there have long been concerns about data collection and the UK’s Information Commissioner’s Office is investigating TikTok’s handling of children’s personal information.

Longfield said: “We’re not trying to say that it’s not fun. Families like it. It’s been something that’s been really important over lockdown, it’s helped people keep in touch, they’ve had lots of enjoyment. But my view is that the price to pay for that shouldn’t be there – for their personal information to be illegally collected en masse, and passed on to others, most probably for financial gain, without them even knowing about it.

“And the excessive nature of that collection is something which drove us to [challenge] TikTok rather than others. It’s the fact that, for this [age] group of children it is the app of choice but also it’s the kind of information they’re collecting – it can’t possibly be appropriate for a video app, especially exact location, and probably face recognition as well.”

The legal claim alleges that TikTok takes children’s personal information without sufficient warning, transparency or the necessary consent required by law, and without parents and children knowing what is being done with their private information. Longfield believes more than 3.5 million children in the UK alone could have been affected.

TikTok’s privacy policy states that it collects information “you share with us from third-party social network providers, and technical and behavioural information about your use of the platform”. It says it also collects information from the user’s phone book if access is granted. Information may be shared with service providers and business partners for purposes including advertising and marketing, according to the policy.

Longfield, who is bringing the case as a representative action for those who claim to have suffered harm, said TikTok’s business model with respect to personal data was “disproportionate”, adding: “Kids can’t give consent.” She believed the case could be a landmark in establishing a framework for social media companies’ responsibilities towards children and families.

A TikTok spokesperson said: “Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to help protect all users, and our teenage users in particular. We believe the claims lack merit and intend to vigorously defend the action.”

In February last year, ByteDance, the Chinese company legally domiciled in the Cayman Islands that owns TikTok, was fined a record £4.2m ($5.7m) in the US for illegally collecting personal information from children under 13.

Tom Southwell, a partner at Scott + Scott, which is acting for Longfield, said: “TikTok and ByteDance’s advertising revenue is built on the personal information of its users, including children. Profiting from this information without fulfilling its legal obligations, and its moral duty to protect children online, is unacceptable.”

ByteDance was approached for comment by the Guardian but did not respond.


Author: Haroon Siddique

Source: The Guardian

Link: https://www.theguardian.com/technology/2021/apr/21/case-launched-against-tiktok-over-collection-of-childrens-data

SkyNews: Social media giant TikTok sued by former children’s commissioner over ‘shady’ data collection policy

Anne Longfield is calling on TikTok to protect under-13s who use the app. Damages in the case could run into billions of pounds.

The former children’s commissioner for England has launched legal proceedings on behalf of 3.5 million children under 13 against TikTok.

Anne Longfield has alleged that the social media platform has illegally collected personal data from millions of children since May 2018 – when General Data Protection Regulation (GDPR) was introduced.

The lawsuit is seeking compensation for millions of potentially affected children, which Ms Longfield said could run into billions of pounds.

The claim argues that TikTok, which was founded by Chinese company ByteDance, breached data protection rules wilfully, taking children's personal information without warning, transparency or the necessary consent.

It is also alleged that personal data was collected without the knowledge of parents and children.

This is the latest development in the lawsuit against the video-sharing app after the High Court ruled in December that a 12-year-old girl, who was supported by Ms Longfield, could bring the dispute with TikTok anonymously.

TikTok policies in the UK do not allow children under 13 to use the app and those downloading it are asked to input their age when they join.

Figures suggest that many under-13s use the platform.

Ms Longfield said she felt the app’s data collection policies, in general, were “excessive for a video-sharing app” but was most troubled by the “collection of data on an industrial scale without either the kids or the parents realising”.

TikTok’s data collection policy is listed on its website, but Ms Longfield said she felt its practices were “hidden” and “shady”.

“In terms of what they take there are addresses, names, date of birth information, their likes, their interests, who they follow, their habits – all of these – the profiling stuff, but also the exact geolocation, that is very much outside what would be deemed appropriate,” she said.

“You shouldn’t be doing that when it’s kids.”

Ms Longfield has accused TikTok of being “deliberately opaque” about who has access to data, but notes the company makes billions from advertising revenue generated by providing user information to advertisers.

A TikTok spokesperson said: “Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to help protect all users and our teenage users in particular.

“We believe the claims lack merit and intend to vigorously defend the action.”

Ms Longfield, who has instructed US litigation specialists Scott + Scott, hopes it will be a "powerful test case" and a "wake-up call" for other social media platforms.

She added that she hoped to force TikTok to delete the data and put new measures in place to protect children.

“I’d like to see them acknowledge the problem, stop collecting the illegal data, delete the illegal data they have and put safeguards in place, so they can demonstrate that they’re acting responsibly,” she said.

“I’d like to see them reassure parents – they have introduced some measures over recent months – great, I’m pleased when people take action, but while this is absolutely at the core of what the business model is, any action won’t get to the heart of what needs to be done.

“So I think they need to communicate that to parents, they need to stop doing it, they need to delete it and put measures in place and then look at how they’re going to rebuild trust – I think that really is what we’re talking about.”

TikTok is one of the world's most popular apps – especially among youngsters – and has around 100 million users in Europe alone.

The COVID-19 pandemic, with many children learning online at home, has helped cement its success.

In January, TikTok tightened privacy rules to protect under-16s, with any accounts for those under the age of 16 changed to private.


Author: Amar Mehta

Source: SkyNews

Link: https://news.sky.com/story/social-media-giant-tiktok-sued-by-former-childrens-commissioner-over-shady-data-collection-policy-12282220

TikTok faces billion-pound legal claim for illegally collecting children’s private information in the UK and Europe

Anne Longfield OBE, the former Children's Commissioner for England, is bringing the legal claim on behalf of millions of children in the UK and Europe.

  • Legal claim launched against TikTok and parent company ByteDance for illegally collecting children's personal information while using the app

  • Anne Longfield OBE is fighting on behalf of parents to lift the veil on TikTok's collection of, and profit from, millions of children's private information

  • Legal claim aims to stop TikTok illegally processing millions of children's information, and demands that all children's personal information is deleted

Wednesday 21 April 2021 – London – Today, a legal claim has been launched by Anne Longfield OBE, the former Children’s Commissioner for England, against the popular video-sharing app TikTok and parent company ByteDance for deliberately violating UK and EU children’s data protection law.

Anne Longfield OBE has brought the legal claim on behalf of millions of children in the UK and Europe. Every child that has used TikTok since 25 May 2018, regardless of whether they have a TikTok account or what their privacy settings are, may have had their private personal information illegally collected by ByteDance through TikTok for the benefit of unknown third parties.

The wealth of children’s private information processed, allegedly illegally, by TikTok has prompted concerns over what the app is doing with this information. The personal information allegedly collected by TikTok and ByteDance includes children’s telephone numbers, videos, pictures, and their exact location, along with biometric (or facial recognition) data.

The legal claim argues that TikTok takes children’s personal information without sufficient warning, transparency or the necessary consent required by law, and without parents and children knowing what is being done with their private information.

TikTok is deliberately opaque about who has access to children's private information, which is incredibly valuable to the company. Its parent company, Cayman Islands-based ByteDance, was expected to make nearly $30 billion in 2020, with over two thirds of this being advertising revenue involving the transfer of personal information.

Anne Longfield OBE, along with experienced law firm Scott + Scott, is fighting on behalf of children and parents to stop TikTok illegally processing millions of children's information, and demanding that the company deletes all the children's personal information. The claim also aims to win compensation for the millions of affected children, which could be thousands of pounds per child. The damages owed by TikTok if the claim is successful may be in the billions of pounds.

Anne Longfield OBE, the claimant’s litigation friend against TikTok said:

“TikTok is a hugely popular social media platform that has helped children keep in touch with their friends during an incredibly difficult year. However, behind the fun songs, dance challenges and lip-sync trends lies something far more sinister.

TikTok is a data collection service that is thinly veiled as a social network. It has deliberately and successfully deceived parents, whose best intentions are to protect their children, and children themselves.

Parents and children have a right to know that private information, including phone numbers, physical location, and videos of their children, is being illegally collected. TikTok appears set on making it as difficult as possible for millions of mothers and fathers to know who is benefiting from this information.

We want to put a stop to TikTok's shadowy data collection practices, and demand that they delete all private information that has been illegally processed when children use the app."

Tom Southwell, Partner at the law firm Scott + Scott, commented:

“The information collected by TikTok represents a severe breach of UK and EU data protection law. Children do not understand how exposed they are when they use the app, and parents have been deliberately left in the dark by TikTok.

TikTok and ByteDance's advertising revenue is built on the personal information of its users, including children. Profiting from this information without fulfilling its legal obligations, and its moral duty to protect children online, is unacceptable.

We hope that TikTok gives serious consideration to the gravity of the concerns of millions of parents and takes considerable steps to improve its practices in light of the issues raised by the case."

Further information on the legal claim

TikTok is a popular short-video app owned by its Cayman Islands-based parent company, ByteDance, with 800 million users worldwide.

The legal claim has been brought on behalf of millions of children using TikTok in the UK and European Economic Area who have been impacted by the app’s actions. Research conducted in support of the legal claim estimates that over 3.5 million children are affected in the UK alone.

The claim alleges that TikTok and ByteDance have violated the UK Data Protection Act and the EU General Data Protection Regulation (GDPR), namely articles 5, 12, 14, 17, 25, 35 and 44 of the GDPR.

TikTok and ByteDance have displayed a troubling pattern of breaking child data protection laws. In 2019, TikTok was issued a record fine for a case involving child data in the United States. This was followed by similar penalties in South Korea in 2020.

TikTok has subsequently implemented measures for its users in the United States to verify their age when they open the app. Despite this, TikTok has refrained from introducing a similar age verification policy in the UK or other European countries.

Concerns have also been raised by UK MPs about alleged information sharing between TikTok users in the UK and ByteDance, which could be subject to China's National Intelligence Law.

Further information on the claim can be found at http://tiktokdataclaim.uk/.

Law360: TikTok Faces Suit Over Child Privacy From London Preteen

Law360, London (January 4, 2021, 3:43 PM GMT) — A judge has ruled that a 12-year-old girl can remain anonymous in her pursuit of claims that the video-sharing app TikTok is illegally exploiting the personal data of children.

Judge Mark Warby said in a Dec. 30 ruling at the High Court that he would permit the case to go forward, with the claimant being identified only as a girl of 12 from London.

“Disclosure of that information is a lesser measure than total elimination of all personal information other than her age, and one that does not create a material risk of the harms identified,” the judge wrote.

The litigation, backed by Anne Longfield, the Children’s Commissioner for England, targets TikTok Inc. and five related companies, including its “effective predecessor” Musical.ly.

Acting as a “litigation friend” on behalf of the girl, Longfield told the court that the lawsuit seeks to bring a representative action on behalf of all children under 16 years of age who are or were TikTok users.

The suit claims that TikTok has misused the 12-year-old girl’s private information for advertising purposes, in violation of duties imposed by the European Union’s General Data Protection Regulation rules and the U.K.’s corresponding legislation.

In addition to unspecified damages for “loss of control of personal data,” the suit seeks to have the girl’s personal information erased from the TikTok platform.

Judge Warby noted in his ruling that the lawsuit against TikTok is “clearly inspired” by a collective lawsuit in the U.K. accusing Google LLC of tracking the personal data of iPhone users.

That case is due to come before Britain’s Supreme Court, which agreed to hear Google’s challenge to an appellate court finding that the iPhone users had sufficient common ground to be considered a representative group.

“Having read the papers, it was clear that those representing the claimant do not wish to press on with the case until the outcome of the appeal in Lloyd v. Google is known,” Judge Warby said. “But they were keen to issue the claim before the year end.”

The urgency stems from the fact that the end of the Brexit transition, at the close of 2020, brought about changes in law — specifically GDPR — which the girl’s lawyers said would affect her claim.

According to the girl’s counsel, the court will maintain jurisdiction over the GDPR claims as the case was filed before the end of the year.

That jurisdiction would be “less clear” if the claim had been filed in 2021, the girl’s counsel said in court filings, and could “prejudice the ability of the claimant to bring the claim and/or defend any jurisdictional challenge” brought by the defendants.

Her lawyers also told the court that they needed to file before the end of 2020 because that would make it easy to enforce any judgment given in EU member states without further procedures.

In his ruling, Judge Warby said he expected that the case would attract significant attention, some of which would be focused on the girl if her identity was known.

What was persuasive, he said, was the evidence of Longfield and her office that there was a real risk of direct online bullying by other children or users of the TikTok app, as well as “negative or hostile reactions from social media influencers who might feel their status or earnings were under threat.”

The judge also gave significant weight to her argument that if the court required her to be named, it could have a “chilling effect” on claims brought on behalf of children to vindicate their data protection rights.

Other companies named as co-defendants in the litigation include Tiktok Information Technologies Ltd., Tiktok Technology Ltd., Bytedance Ltd., Beijing Bytedance Technology Company Ltd. and Musical.Ly.

The defendants were not present or represented in court Wednesday.

In its terms of service, TikTok states that it is a platform only for people 13 and older. Prospective users entering a birth date indicating they are younger than that age are not allowed to register and are blocked from trying again.

“Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to protect all users, and our younger users in particular,” a TikTok spokesperson told Law360.

Representatives for the girl and Longfield did not respond to requests for comment.

The claimant is represented by Charles Ciumei QC and Helen Morton of Essex Court Chambers, instructed by Scott + Scott UK LLP.

Counsel information for TikTok and the other defendants was not available Monday.

The case is SMO v. TikTok Inc. and others, case number QB-2020-004576, in the High Court of Justice of England and Wales, Queen’s Bench Division, Media and Communications List.

– – –

Author: Bonnie Eslinger

Source: Law360

Link: https://www.law360.com/articles/1341121

Business Insider: TikTok is facing legal action from a 12-year-old girl over the way it handles children’s data

TikTok is facing the prospect of legal action from a 12-year-old English girl over whether the way it handles children’s data violates European Union and UK privacy laws.

The girl, who on Wednesday won the right to remain anonymous should she bring a case against the shortform-video company, is being supported by England’s children’s commissioner, Anne Longfield.

Per the BBC, Longfield told the High Court in London she hoped a case would result in TikTok being ordered to delete the plaintiff’s data, thereby setting a precedent. Longfield, who is bringing the claim on behalf of the child, is due to leave her post as children’s commissioner in May.

Justice Mark Warby granted anonymity to the plaintiff on the grounds that if her identity were revealed she might be cyberbullied by peers or even harassed by social-media influencers “who might feel their status or earnings were under threat,” according to a High Court ruling published late last month.

In his description of the planned suit, Warby said it “involves serious criticisms of what may be key aspects of the platform’s mode of operation.”

“Privacy and safety are top priorities for TikTok, and we have robust policies, processes, and technologies in place to protect all users and our younger users in particular,” a TikTok representative told Business Insider. “As this application was made without notice, we first became aware of the application and the High Court’s judgment when it was filed and are currently considering its implications.”

Technically, under-13s are not supposed to be able to hold TikTok accounts, per the app's terms and conditions. This isn't the first time the app has come under scrutiny over the way it protects children on its platform.

In February 2019, TikTok agreed to pay a $5.7 million fine to settle allegations from the US Federal Trade Commission that the Musical.ly app, as TikTok was previously known, illegally collected the personal data of under-13s. In May of last year, a group of 20 advocacy groups accused TikTok of violating its 2019 settlement with the FTC, saying it still contained data relating to account holders under the age of 13.

Author: Isobel Asher Hamilton 

Source: Business Insider

Link: https://www.businessinsider.com/tiktok-legal-action-12-year-old-girl-childrens-data-2021-1

BBC: TikTok faces legal action from 12-year-old girl in England

A 12-year-old girl is hoping to take legal action against video-sharing app TikTok, claiming the company uses children’s data unlawfully.

A court has ruled the girl can remain anonymous if the case goes ahead.

The action is being supported by Anne Longfield, the children’s commissioner for England. She believes TikTok has broken UK and EU data protection laws.

TikTok said it had “robust policies” in place to protect children and did not allow under-13s to join.

Ms Longfield hopes the case will lead to greater protective measures for under-16s who use TikTok in England and possibly beyond.

She believes the app collects and processes children’s data to power its video-recommendation algorithm, to capture viewers’ attention and generate advertising revenue.

The commissioner told the High Court in London – via a video link – that she hoped it would ultimately issue an order forcing the firm to delete the child's data, setting a precedent.

But the focus of the preliminary hearing was to decide whether the 12-year-old girl could make a claim anonymously.

Mr Justice Warby judged that the girl risked being cyber-bullied by other children and TikTok users if her identity was revealed.

He said she could face “hostile reactions from social media influencers who might feel their status or earnings were under threat”.

Ms Longfield is waiting for the conclusion of a data protection case against Google before deciding whether to sue TikTok.

In 2019, TikTok was fined $5.7m (£4.2m) by the US Federal Trade Commission for its handling of children’s data.

South Korea issued a fine for similar reasons in 2020.

In a statement, TikTok said: “Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to protect all users, and our younger users in particular. As this application was made without notice, we first became aware of the application and the High Court’s judgment [on Wednesday] and are currently considering its implications.”

The app’s terms and conditions state the service is not available to under-13s and all users are asked their age when signing up.

It actively reviews and removes accounts that appear to be used by under-13s.

– – –

Source: BBC

Link: https://www.bbc.com/news/technology-55497350

Pre-action application for anonymity for the child representative approved

While we cannot comment on the case, pending the completion of pre-action protocol, we can express how very pleased we are with the findings and the protection the court gave our client in response to our pre-action application for anonymity for the child representative. We are grateful to Mr. Justice Warby for recognizing that, in an action designed to protect the rights of children, it is imperative to protect the identity of a minor claimant.

Official Approval Document

Video Social Networking App Musical.ly Agrees to Settle FTC Allegations That it Violated Children’s Privacy Law

The operators of the video social networking app Musical.ly, now known as TikTok, have agreed to pay $5.7 million to settle Federal Trade Commission allegations that the company illegally collected personal information from children. This is the largest civil penalty ever obtained by the Commission in a children’s privacy case.

The FTC’s complaint, filed by the Department of Justice on behalf of the Commission, alleges that Musical.ly violated the Children’s Online Privacy Protection Act (COPPA), which requires that websites and online services directed to children obtain parental consent before collecting personal information from children under the age of 13.

“The operators of Musical.ly—now known as TikTok—knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” said FTC Chairman Joe Simons. “This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”

The Musical.ly app allowed users to create short videos lip-syncing to music and share those videos with other users. To register, users were required to provide an email address, phone number, username, first and last name, a short biography, and a profile picture. Since 2014, more than 200 million users have downloaded the Musical.ly app worldwide, while 65 million accounts have been registered in the United States.

In addition to creating and sharing videos, the app allowed users to interact with other users by commenting on their videos and sending direct messages. User accounts were public by default, which meant that a child’s profile bio, username, picture, and videos could be seen by other users. While the site allowed users to change their default setting from public to private so that only approved users could follow them, users’ profile pictures and bios remained public, and users could still send them direct messages, according to the complaint. In fact, as the complaint notes, there have been public reports of adults trying to contact children via the Musical.ly app. In addition, until October 2016, the app included a feature that allowed users to view other users within a 50-mile radius of their location.

The operators of the Musical.ly app were aware that a significant percentage of users were younger than 13 and received thousands of complaints from parents that their children under 13 had created Musical.ly accounts, according to the FTC’s complaint.

The complaint alleges that the operators of the Musical.ly app violated the COPPA Rule by failing to notify parents about the app’s collection and use of personal information from users under 13, obtain parental consent before such collection and use, and delete personal information at the request of parents.

In addition to the monetary payment, the settlement also requires the app’s operators to comply with COPPA going forward and to take offline all videos made by children under the age of 13.

The Commission vote to authorize the staff to refer the complaint to the Department of Justice and to approve the proposed consent decree was 5-0. Commissioner Rohit Chopra and Commissioner Rebecca Kelly Slaughter issued a separate statement.

The DOJ filed the complaint and proposed consent decree on behalf of the Commission in the U.S. District Court for the Central District of California. NOTE: The Commission authorizes the filing of a complaint when it has “reason to believe” that the law has been or is being violated, and it appears to the Commission that a proceeding is in the public interest. Consent decrees have the force of law when approved and signed by the District Court judge.

The FTC would like to thank the Better Business Bureau’s Children’s Advertising Review Unit (CARU) for helping to bring attention to this matter.


– – –

Source: United States Federal Trade Commission (FTC)

Link: https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc