
Algorithms are incredible tools. When run by computers, they can fly airplanes without human pilots. They can figure out how much insulin a teen with Type I diabetes needs minute to minute. They even decide what social media shows you — and billions of other people.
Your brain creates algorithms, too. It uses them to decide where to stand on the soccer field when you’re playing defense. Or map the best bike route to school. Or choose between multiple-choice options on a test.
“An algorithm is just a fairly precise list of instructions to accomplish a specified task,” says Noah Giansiracusa. He’s a mathematician at Bentley University in Waltham, Mass.
Algorithms have been around since the beginning of human history. Hammurabi’s Code, for example, is a list of 282 rules from almost 4,000 years ago. A Babylonian king named Hammurabi laid out these rules for his subjects. They included things like the punishment for stealing an ox. (You had to pay back 30 times its value.)
Those rules are perhaps the first “algorithm for justice,” says Giansiracusa. Today, people use algorithms to design anything from friendship bracelets to skyscrapers.

But when people use the word “algorithm” these days, they’re often referring to algorithms online. These can have a huge influence on us. Large companies have created computer algorithms that decide what users see on social media and other websites.
Online algorithms can help you find the content you want more quickly and reliably. But they can also treat you like a preschooler, assuming they know what you want better than you do. Algorithms may try to push you into doing what their creators want, even if it’s far from what you want.
You are not powerless in this, Giansiracusa says. He describes why in his book, Robin Hood Math: Take Control of the Algorithms That Run Your Life. Algorithms are often at work behind the scenes online. But you can learn how they work and how they often try to influence you.
“If you know about algorithms,” says Giansiracusa, “you can tailor them for you.” That way, he says, they can actually help you — not just some big company.

Many of the algorithms behind social media and other sites work like recipes. When using a recipe, you follow a list of steps that combine ingredients (inputs) to make a desired dish (output). In cooking, some ingredients are more important than others. You need flour to make muffins, for instance. But raisins or nuts are optional.
The mathematical algorithms that computers use to make decisions work a similar way. They also have inputs of varying importance or “weight.” More important inputs have larger weights. Less important inputs have smaller weights.
Imagine you want to get a dog, but you don’t know what type. You can make an algorithm to help you decide.

“Look for websites that rank dogs … on a range of factors,” Giansiracusa says. Factors could be how good the dog is with kids, how much exercise it will need or how good a guard dog it is. Those factors are your inputs. “Decide how much each of the factors matters to you,” Giansiracusa says. If you think each factor is equally important, you’d give each a weight of 1.
Now consider different dogs you might adopt. Maybe a golden retriever, a Yorkshire terrier or a St. Bernard. Rank them on each factor, where 3 is the best and 1 the worst. A golden retriever might get a 3 for being best with kids, while the Yorkie gets a 2 and a St. Bernard a 1. Now rank each dog on guard-dog skills and exercise needs.
Multiply each dog’s good-with-kids rank times the weight you chose for that factor. Do the same for guard-dog skills and exercise needs. Then add up the three products in each row to get one score for each dog. (In math, this is called a “weighted sum.”)
When the weights are all 1, the St. Bernard gets a score of 7: 1×1=1 (for goodness with kids) plus 3×1=3 (for guard dog) plus 3×1=3 (for exercise). It scores highest! But maybe you decide kid-friendliness is more important and raise that weight to 2. Now the retriever scores a 9: 3×2=6 (for kids) plus 2×1=2 (for guard dog) plus 1×1=1 (for exercise). “If we put a little more weight on safety with kids, the golden retriever jumps into the lead,” Giansiracusa points out.
Pretty doable, right? This is the model for many algorithms.
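The dog-choosing steps above can be sketched in a few lines of Python. The ranks for the golden retriever and St. Bernard come from the example; the Yorkie's guard-dog and exercise ranks aren't given in the article, so the placeholder 2s below are made up.

```python
# Each dog's rank on three factors (3 = best, 1 = worst):
# good with kids, guard-dog skills and exercise needs.
dogs = {
    "golden retriever":  {"kids": 3, "guard": 2, "exercise": 1},
    "yorkshire terrier": {"kids": 2, "guard": 2, "exercise": 2},  # guard/exercise ranks assumed
    "st. bernard":       {"kids": 1, "guard": 3, "exercise": 3},
}

def score(ranks, weights):
    # Weighted sum: multiply each rank by that factor's weight, then add.
    return sum(ranks[factor] * weights[factor] for factor in weights)

equal_weights = {"kids": 1, "guard": 1, "exercise": 1}
kid_focused   = {"kids": 2, "guard": 1, "exercise": 1}

print(score(dogs["st. bernard"], equal_weights))       # 7 — highest when all weights are 1
print(score(dogs["golden retriever"], kid_focused))    # 9 — takes the lead when kids count double
```

Changing a single weight reshuffles the ranking, which is exactly why the golden retriever jumps ahead once kid-friendliness counts double.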
Algorithms have exploded online. They determine what appears on your social media feeds, including on Instagram, TikTok and Snapchat. They also power sites that recommend content, such as Netflix, YouTube and Roblox. The content that algorithms decide to show users could be words, videos, photos, music, video games, movies, TV shows or products.
Algorithms can be amazingly helpful. Before we can blink, they sift through enormous piles of information much too large for a human to handle. Then, they give us the info we seek — or that would likely interest us. As in our dog-choosing algorithm, online systems do this (at least in part) using weighted sums.

The inputs for online algorithms are probabilities, Giansiracusa explains. Each is the probability or likelihood that you will engage with a post in a certain way. The probability you’ll share a post is called Pshare, which might be different from the likelihood that you’ll comment on it, Pcomment.
All the probabilities are numbers that the algorithm estimates. How? It analyzes data. What data? Your data.
“Algorithms track what you search for, click on, watch, follow, like, share, comment on — and even how long you spend viewing certain posts,” explains Myojung Chung. She’s an expert on journalism and media innovation at Northeastern University in Boston, Mass.
“Social media algorithms are shaped by nearly every aspect of your online behavior,” Chung says. “This begins with your basic demographic data — such as your age, location, gender and racial or ethnic identity.”
Armed with those data about you, a social media algorithm calculates how likely you are to engage with some particular post. As with selecting a dog, it then multiplies each probability by its weight and sums them up. This produces a single number, or score, for each post. The top posts in your feed will be the ones having the highest scores. The algorithm puts lower-scoring posts further down in your feed.
Why this focus on engagement? It has to do with the algorithm’s goal.
We users may want social media to show us what our friends have been up to or serve us the latest funny memes. The company behind a social media app has its own goal: what it wants to get from you as you use its app. These two goals may be different — maybe even directly in conflict with each other.
Is the goal of social media platforms to entertain, inform and satisfy us? Not really.
Companies are trying to make money from our time on their platforms. They give us the apps for free — then get money from ads. Other companies (for example, Nike or H&M) typically pay social media companies to have clickable ads for their products show up next to — or even within — our feeds.
Product makers will pay more to show their ads to users who are more likely to click on them. And “the more time you spend on the app, the more ads you will see,” Giansiracusa explains — so the more likely you are to eventually click one.
The bottom line is that social media companies make more money when you stay on their platforms longer. The goal of their algorithms becomes keeping you on their sites as long as possible.
A user’s goal is different. “I never log into TikTok saying, ‘My goal today is to spend as many minutes as possible on this,’” jokes Giansiracusa. Instead, his goal on LinkedIn might be “to make valuable connections. On TikTok, it’s to be entertained.” On Facebook, he says, “I want to see family pictures.”
As a result, users and app owners frequently find themselves in a tug of war. As users, we try to make our own decisions, while social media companies try to reel us in by giving us posts we are likely to click on. This can affect us negatively — beyond just being online longer than we had planned.
Social media apps serve up many posts similar to what we have clicked on in the past. They also feed us posts that people similar to us have clicked on. But this can cause our horizons to narrow instead of widen. We see more of the kinds of things that we’ve seen before. And we may feel pressure to like what others do.
Plus, any content that is shocking or causes strong emotions gets clicked on a lot. So algorithms tend to push that content on users. Such posts can easily be fake, however. Social media is known to contain lots of misleading and false information.
Remember that the algorithms are not trying to affect you in any way, good or bad. They’re just trying to determine what posts will keep you clicking.
An online algorithm “doesn’t care if I’m entertained or if I’m nauseated,” Giansiracusa says. It couldn’t care less, he adds, if I’m just scrolling and “feel horrible but I can’t turn away.” It’s not human. It doesn’t understand your wants or needs. It has no values or morality. All it does is calculate which posts are most likely to get your attention, then feed them to you, Giansiracusa says. “That’s all!”
We all need to have a say in what we see. But when you’re stuck in a scroll, it can feel impossible to get offline, or even to influence what your feed shows you.
By altering your online behavior, you can take back control, Giansiracusa notes. Specifically, he says: “You can influence the Ps — the estimates of your engagement probabilities.”
It isn’t easy. For starters, each platform decides how much weight to assign Pshare, Pcomment and other probabilities. And remember from the dog example: the weight given to each input can greatly change what the algorithm recommends.
What’s more, “we don’t know what’s in the algorithms or exactly how they’re weighting [their inputs],” Robbie Torney points out. He’s senior director of AI programs at Common Sense Media. This organization offers tips on how to improve the online experiences of children and teens.
Finally, social media algorithms “are always changing,” Giansiracusa says — “changing without our knowledge.”
Kyle Chayka sums it up in his book, Filterworld: How Algorithms Flattened Culture. On any given day, he notes, users cannot fully understand what went into an algorithm’s recommendations. Why? “Their equations, variables and weights are not public. … They are closely held trade secrets, almost like nuclear codes.”
But don’t despair. Here’s how app users can wrest back some control over algorithms.
Mold your feeds. Don’t just accept what the algorithm is serving you. Ask yourself: “Is this something that I intentionally want to see? Or is this just what social media is putting in my feed?” says Jessica S., 17, of Boca Raton, Fla. Jessica is a youth leader for GoodForMEdia.org. This group describes itself as “a youth-led, peer-mentoring and education program exploring the ME in media.”
Follow “accounts that inspire or educate you,” suggests Nandini V. And “unfollow or mute accounts that have more negativity.” She is a 16-year-old in Fremont, Calif., who is on the Teen Advisory Board at #HalfTheStory. This youth-led group works to advance digital wellness.
Use large platforms sparingly. Try not to spend most of your screen time on big social media sites. Use Instagram and the other big apps to learn about things and people you’re interested in, Chayka recommends. Then “move with [those content creators] to a platform that’s less exploitative” and more individual.
For instance, you could subscribe to podcasts you find interesting. Or find specific YouTube channels that review music you enjoy. You could even sign up to get newsletters from favorite creators on platforms like Substack. Once you find writers or artists you like, you can enjoy your favorite content on those sorts of sites directly, without the social media feed.
Give feedback carefully. If you don’t like something, give NO feedback. Don’t tap “like.” Don’t comment on it. Don’t rewatch it. Just ignore it. As Giansiracusa explains, social media algorithms tend to see any feedback (even negative comments) as a statement of interest — and will send more posts like it your way.
Practice identifying fake news. Chung recommends doing this with a friend. Each of you come up with a few real videos (or photos) and an equal number of fake news or AI-generated ones. Present a pair of videos or pics to your friend and have them try to figure out which is real and which is fake. Talk about whether they were right or not and how to spot a fake. Then switch roles.

Use helper tech. Some apps, like Opal, Forest, Onesec or Freedom, let you track your social media use or set time limits. Edward Thomas recommends asking yourself: “When do I actually stop feeling the happiness, excitement or enjoyment that I get from social media?” Then set that as your time limit. Thomas is a 19-year-old in Cambridge, Mass., and a teen mentor with GoodForMEdia.
Deleting or offloading apps from your phone can also help in stressful times, like during exams, says Rachel Hanebutt. Limit notifications, or use night shift or grayscale in your phone’s settings. Hanebutt is a developmental psychologist at Georgetown University in Washington, D.C.

Perhaps the most powerful way to make algorithms work for you is to design your own.
Maybe you’ll be choosing a college soon. You could start by reviewing data at popular college-guide sites. But don’t use their algorithms. Make your own! Select the inputs that matter to you. Those might be cost, a good theater program or a chance to do marine-science research. Apply a weight to each input and rank each school on each one. Then calculate some weighted sums and compare.
You can use the same approach to make other decisions too, from how to celebrate your birthday to what instrument to play in the school orchestra. When you take more control over your choices, putting more of your own thoughts and opinions into your life decisions, they — and you — will feel better.
Algorithms are all around us, including in our heads. Now that you know how they work and some of their pitfalls, you can make them work for you.





