Week 2 - Oversharing Online
1. Do different social networking sites offer different benefits and drawbacks?
Different social networking sites have their pros and cons. Instagram used to be solely for sharing photos, but it has since become a major advertising platform with algorithms that promote Reels, especially sponsored ones. Facebook has always let you share many photos and long write-ups (kind of like a blog) in a single post, but now the platform is also full of advertisements and does not show all the content shared by the people you follow. I have never used Twitter, but I think that platform, at its inception (though certainly not now), was good at circulating clear, concise information in 140 characters or fewer. Now there is so much censorship across all platforms that it can be nearly impossible to see the content you might prefer, or content from the people you follow, because platforms shadow-ban and hide it.
2. Is there an ideal number of “friends” or connections individuals have on Facebook that can improve their mental health?
I have a lot of "friends" I never talk to and haven't seen in well over ten years. Sure, Facebook is a good way to keep up with past acquaintances and high school friends as well as your current group of friends, but I do not agree that the number of connections one has on Facebook improves mental health. Admittedly, my standpoint is biased, since I only use the platform for school and hiking groups and do not connect with anyone on Facebook.
3. What factors might influence whether Facebook has negative influences, like links with depression, versus positive results, such as boosts in self-esteem?
I think the heart/like buttons on Facebook and Instagram can have negative influences: when people do not receive enough "likes," they may tie that to their self-worth, although receiving plenty of "likes" can also boost serotonin and self-esteem. It is a double-edged sword. If you do a quick Google search, tons of articles come up saying that Facebook and social media use is linked to mental distress, anxiety, and depression.
4. Do social networking sites have any responsibility in promoting mental health in their users? If so, how might they go about doing so?
I think that if a company is going to present itself as essential for connecting with other people, advertising products or a small business, and circulating information instantly, then it absolutely has a responsibility to promote mental health in its users. These online environments can become toxic because of trolls and mean-spirited people who comment on posts just to put others down. Furthermore, there are a ton of scammers on Facebook trying to "sell" concert tickets or "rent" houses. There are systems in place on these platforms to combat this, but they are usually ineffective. When users "report" misconduct, there should be an option to explain why; if the platform actually knew the reason, I think it would be more inclined to remove those scammers or the people not following its "community guidelines."
There are a ton of pages on Facebook that give you tips and tricks on how to protect yourself and your account. But if something does happen, the offending person rarely faces any real consequences. That can be detrimental to users and their mental health, and it needs to change.