Is the Musical.ly App Safe for 12-Year Olds?

Musical.ly is a relative newcomer to the social app scene, and seems to be particularly popular with young teens and perhaps even younger kids. The way it works is that users can take a clip from a song or other media that they own, and then create a 15-second video with that clip as the soundtrack. Lip syncing and comedic sketches dominate the feed.

It sounds like fun, and according to an article at Business Insider, the app had 70 million users as of last month. Is it safe for 12-year olds, or younger users? Let’s take a look.

Age Limit

For starters, the age limit is 13, or possibly 18, and that’s not a joke. Once you download the app, you are never asked your age, so maybe they don’t care all that much. According to the app’s Terms of Service:

“IF YOU ARE UNDER 13 YEARS OF AGE, YOU MUST NOT USE OR ACCESS THE SERVICE AT ANY TIME OR IN ANY MATTER. Furthermore, by using the service, you affirm that you are at least 18 years of age.”

We’re guessing that the 18-year-old reference is a mistake and the app’s age limit is really intended to be 13. Incidentally, it is rated 12+ in the App Store. In any case, if your child is under 13 and wants to use the app, you should know that she is breaking the rules.

Are the rules – in this case the age limit – important? If your child is under 13, they are. Under-13s are protected by the Children’s Online Privacy Protection Act, which is your and your child’s primary means of protecting her personal information. Make no mistake, Musical.ly does collect your child’s personal information, and might use it, even if/after she deletes her account.

Adult Content

When it comes to whether your child will encounter adult content on the app, the Terms of Service appear to be quite strict. Musical.ly does not allow users to post material that is “abusive, bullying, defamatory, harassing, harmful, hateful, inaccurate, infringing, libelous, objectionable, obscene, offensive, pornographic, shocking, threatening, unlawful, violent, vulgar or promoting bigotry, discrimination, hatred, violence, or inciting violence.”

In terms of nudity, sexual and other adult content, our take is that the language above is quite vague. While obscene content and pornography are specifically prohibited, much depends on how those terms are defined and how the app monitors for violations. Notably, nudity itself is not on the prohibited list, so one might assume that some nudity is allowed. While browsing the app this week, we didn’t see any nudity, but profanity was easy to find.

Hashtags are widely used on the app, and it has a search function, but we were happy to see that searches for many adult-oriented hashtags (#boobs, #sexy, etc.) are disabled.

Predator Risk

As with most social apps, predator risk is an issue. If you want to keep your child safe, we recommend setting the account to private, hiding location info and only allowing direct messages from friends. All of those controls are accessible from the settings menu.

In summary, Musical.ly is no more unsafe than most other social apps, and it probably has less adult content than many. However, since a vast number of users appear to be girls well under the age of 18, it could over time become a favorite of predators. We would still caution parents against allowing kids 12 and under to download the app, and recommend caution with older teens.

If you want to read more about what other parents have to say, there are a lot of parent reviews posted at Common Sense Media.



If you are worried that your teen or tween is at risk, we can help. The ThirdParent initial audit is now FREE (previously a $49 value). Ongoing monitoring is $15 per month and you can cancel at any time. Click here to sign up today!



Contact ThirdParent any time for help and resources for monitoring child and teen internet activity.

Work at a high school or college? We have custom solutions for monitoring dangerous or inappropriate activity. Learn more.


Follow us on Twitter or Facebook for more news and information on keeping your teens safe online. You can also sign up for our weekly newsletter below.


Is WhatsApp Safe for Teens?

If your teen has friends and a cell phone – and what teen doesn’t? – she is probably using a messaging app in addition to or instead of the text messaging client that came installed on her phone. The most popular is WhatsApp, with 800 million users globally, although Kik Messenger and Facebook Messenger may be more popular in the U.S. Is it safe for teens? That depends.

First of all, according to the app’s Terms of Service, the age limit for WhatsApp is 16, though it is largely ignored. In fact, as of October 2014, 8% of U.S. internet users aged 14 – 17 used WhatsApp, and that number is undoubtedly higher now.

Facebook acquired WhatsApp last year for a hefty $18 billion or so, which might lead one to ask why the age limit for WhatsApp (16) is higher than that of Facebook (13). We believe the reason is that WhatsApp carries more and different risks than Facebook, especially for teens. Let’s take a look at why:

Adult content – On Facebook, there are strict rules about what types of content are permitted; on WhatsApp there are few strict prohibitions (e.g. “Adult content must be identified as such” – we’re not even sure what that means). If you’re hoping that human moderators will protect your teen from inappropriate content on WhatsApp, you’re out of luck.

Predator risk – The playbook for a typical predator often follows the same pattern: find a teen on Facebook or Instagram then send a friend request. After you’re friends with the teen, attempt to establish a rapport (what is referred to as “grooming”) and keep communicating. The next step is usually to move the conversation over to a more private platform, like a messaging app.

Sexting – If your teen wants to send a risqué photo or video, he is going to find a way to do it. Since WhatsApp allows picture or video transmission to any user in your address book, it is certainly an option for sexting.

Private, or maybe not – WhatsApp claims that they do not store messages sent and received on their servers, so your teen might think that once a message is sent, that’s as far as it goes. As with any messaging app, messages can be saved by the recipient and retransmitted or posted online. There is always a risk that they will be around forever, and not private at all.

No password – WhatsApp users are not required to set or use a password for the app, so if one of their friends gets their hands on the phone, and the phone is unlocked, there is a risk that a rogue message can be sent.

While there are some risks to teens using WhatsApp, the app itself is not the problem – what your teen is doing with it may be. As a parent you can start by discussing which messaging options your teen is using, how she is using them and who she is communicating with. If any of it sounds like a risk, take it from there, but by all means understand what you’re dealing with when it comes to keeping your teen safe online.


Who Owns Your Teen’s Social Media Accounts?

Well, parents, you don’t. Nor do you have much control over them even though your teen is a minor in your care, control and living under your roof.

Right now in the U.S., millions of kids are old enough to have a smartphone, but not old enough to legally or safely use social media. Well, maybe they aren’t old enough to have a smartphone either, but they have one anyway. Survey results vary, but the general consensus is that the average age at which kids get a cell phone is around 10. Nowadays, that’s probably a smartphone – a handheld computer that is immediately connected to the world via the internet and social media.

Just having a smartphone isn’t dangerous in and of itself. In fact, if used responsibly, it’s actually safer to have one than not. Parents can contact the child any time, and the child in turn can call or message someone if there’s a problem. In a nutshell, that’s the good part of being connected.

Being connected also has a downside. Once a child is communicating digitally, a whole new set of risks presents itself – predator risk, cyberbullying, general creepers and exposure to adult “stuff” which can lead to growing up too fast, among other things. Social media can be a gateway to that connection outside a tween’s immediate group of friends.

The legality issue is less of a big deal. Sure, kids aren’t supposed to join a social network before the age of 13, but they aren’t going to get in trouble for doing so. What they do lose is the protection provided by the Children’s Online Privacy Protection Act (COPPA), which serves to protect them and their personal information from advertisers.

So what if your smartphone-wielding, digitally savvy 10-year old joins a social network without your permission? Allow us to explain.

One important consideration is that many tweens, when joining a social network for the first time, either immediately forget the password or fat-finger the email address. Networks like Instagram don’t send a confirmation email before account activation, so an account can be established and accessible via smartphone, but the tween will have no way to change the email address or password, or to delete the account.

Let’s say a parent finds out about the rogue account. Perhaps there has been an incident of cyberbullying, or the child has revealed too many personal details. That parent would naturally tell the child to delete the account. If the minor is unable to delete the account, the parent should be able to request that the social network do so. Turns out it’s not that simple.

In our opinion, if the parent of any minor, no matter the age, wants a minor’s social media account deleted, she should be able to make that happen. In the U.S., that’s not the case. If your child is between the ages of 13 and 17 – still a minor in your care – a social network is under no obligation to delete an account or even respond to your request. If your child is under 13, the social network is required to delete the account, but only if it has “actual knowledge” that the child is under 13.

The Federal Trade Commission (FTC), which should be kind of a big deal on issues such as these, has the following to say on the matter:

“As a parent, you have rights covering the collection of your children’s information online when they’re under 13. Learn more about COPPA — the Children’s Online Privacy Protection Act — which requires sites and services to get your approval before they collect, use or disclose your child’s information.”

If your under-13 child has joined a social network, that network has collected personal information on your child. If you contact the social network and tell them/prove that your child is under 13, they have actual knowledge. Again, it’s not that simple.

Some networks make it next to impossible for parents to contact them. Others simply ignore the requests unless there is proof, such as a birth date, in the profile of the account.

Regardless of their business model, the social networks are in the business of amassing large numbers of users. To allow parents to delete minors’ accounts would run contrary to that goal.

What we think parents deserve is the ability, upon furnishing proof of a minor’s age and the relationship, to have underage accounts deleted easily, and in a timely fashion. Until that happens, we advise parents to establish firm guidelines before handing over that first smartphone.
