Facebook’s Data About You Goes Deeper Than Your Therapist’s Notes
In 2018, Professor David Carroll of The New School set out to gain access to the data Facebook holds on him. His efforts were showcased in the Netflix documentary The Great Hack and helped expose the Facebook-Cambridge Analytica data scandal. To this day, Carroll has yet to receive a full copy of the data Facebook keeps on him.
At a time when an unprecedented 61% of young adults in the United States report "severe loneliness," according to Harvard researchers, it's not hard to agree that building communities, both online and offline, is a socially valuable business. Facebook and other social media companies started with this intention and created a space where people could easily connect. Explaining the origins of Facebook at Y Combinator, founder Mark Zuckerberg once said that he had "always been fascinated by people" and wanted to build a space where people could connect and get to know each other better.
About two months after Facebook launched in 2004, the founder told The Harvard Crimson: "Maybe it would be nice to have ads to offset the cost of the servers." What started out as the company's attempt to stay afloat quickly grew into a massive advertising business.
At the official launch of Facebook ads in 2007, Zuckerberg said in a statement that his advertising model was "no longer just about messages delivered by businesses, but increasingly about information shared among friends. So we decided to use these social actions to create a new type of advertising system."
This system of "social ads" has been key to making Facebook the trillion-dollar business it is today. Its success was fueled by users' social behaviors and relied on Facebook's ability to successfully guide or change those behaviors.
If you want to change someone's behavior, you would want to know as much as possible about that person and the people around them. Then you can nudge them toward things they might like and encourage them to repeat those behaviors until they become habits. Does this sound familiar?
These behavior-modification techniques are well-known tools of cognitive behavioral therapy and have been used by therapists for decades to help people build healthy habits. Social media companies have used the same techniques to change user behavior.
A key difference is that, unlike a therapist, who is bound by laws like the Health Insurance Portability and Accountability Act (HIPAA) and is clinically trained to make only suggestions that will improve a client's health, social media companies have no incentive or regulation whatsoever to act in the best interests of their users' health.
It's fairer to compare Facebook to an unlicensed therapist whose advice is followed by billions of people around the world.
How do your therapist's notes compare to the data Facebook has about you? While a therapist has relevant information about the challenges a client presents, Facebook's notes on you go much deeper than that and are far more specific.
We've known since 2015 that Facebook can judge your personality better than your friends or family can, according to Time magazine, and even better than you can judge yourself. Facebook and other social media platforms also hold information about your health. This includes self-disclosed information about mental health struggles, which has become a popular trend.
However, self-disclosure is not necessary for Facebook to know what you are going through. Most health information is derived from your online behavior. Recent studies have shown that it is possible to infer risk factors for depression, eating disorders and more by analyzing the media and language users share on social networks. Moreover, these assessments were found to be accurate up to three months before a clinical diagnosis.
Social media companies have accumulated sensitive health information about their users. We can do much more with this fact, even within existing technologies and regulations.
Users can take the data they've already shared online to gain insight into their personalities and understand their mental health risk profiles using services like my company, Aware. Aware applies technologies such as machine learning and natural language processing, similar to Facebook's, to analyze user profiles with their permission and provide a detailed report that includes the mental health risk factors a user should be aware of. This technology allows people to derive useful health information from their social media data today.
Within Facebook, various teams are responsible for the safety and integrity of information on the platform. Social media companies should hire more mental health professionals to understand how their platforms influence mental health and to provide useful tools to users in need.
They should also provide adequate clinical training to employees who work closely with user data. Recent job postings, like Facebook's opening for a quantitative health researcher, are certainly a positive step in this direction.
It's reasonable to argue that by amassing sensitive health information about users, and by knowing that their products influence mental health, social media companies have crossed the line into becoming a health care delivery system. Facebook's claim that it tries to maximize good and minimize harm puts the company even further into the position of a medical professional.
For this reason, lawmakers should consider applying regulations such as HIPAA to social media companies. Enforcing HIPAA would create the incentives and accountability for Facebook and other social media companies to act in the best interests of users' health.
Param Kulkarni is co-founder of Aware. He wrote this column for The Dallas Morning News.