I find a lot of sites feel like they're overtuning their recommendation engines, to the detriment of using the site. YouTube is particularly bad for this - given the years of history and somewhat regular viewing of the site, I feel like it should have a relatively good idea of what I'm interested in. Instead, the YouTube homepage seems myopically focused on the last 5-10 videos I watched.
The problem is the economics of the Internet today. Most sites are ad-funded. They need to maximize pageviews and time on site. The recommender is a huge part of accomplishing that, and it simply won't be tuned to any metric other than maximizing revenue. The solution would be to reskin popular sites with recommenders that had other objectives, such as retrospective satisfaction with time spent. Unfortunately sites know that if they give up control of the UI through an API that is open, they will lose money when people do things like this.
I really think we're entering a crisis here, where sites and apps engineered to maximize corporate metrics are leading people down horrible paths of addiction and psychic stress, as they spend their mental energy constantly resisting the temptations thrown at them.
Thought just occurred: would I be willing to pay a subscription to have the recommender tuned to remove revenue maximisation and site-addiction maximisation? Would anyone?
I've made plenty of "remove ads" in-app purchases on my phone. This isn't too different. And it might actually result in a truly useful experience.
I'm not sure if it is because of weird tastes or something else. But for sites like Steam, YT, Netflix I always wish for more tuning parameters because their recommendations are all horrible.
On Netflix, for example, I get recommended stuff that's similar to other things I didn't like.
Amazon is often the best of them for books because their engine seems to only really value the last few things I looked at / bought and so just recommends really similar stuff. Of course, that fails hard whenever I look at something I'm not interested in for whatever reason.
On Steam I wish they had some elaborate filtering system, there are so many games, I need something like NOT(Adventure+Puzzle) BOOST(RPG), I need to combine tags, not just filter them out by themselves.
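For what it's worth, that kind of tag combinator isn't hard to express. Here's a rough sketch (the game names, scores, and boost factor are all invented) of a NOT(Adventure+Puzzle) BOOST(RPG) style filter:

```python
# Hypothetical tag-combination filter: exclude games carrying ALL tags in
# any "exclude" group, then sort so that boosted tags rise to the top.

def filter_and_boost(games, exclude_groups, boost_tags, boost=2.0):
    """games: list of dicts with 'name', 'tags' (set), 'score' (float)."""
    def excluded(tags):
        # NOT(Adventure+Puzzle): drop a game only if it has every tag in a group
        return any(group <= tags for group in exclude_groups)

    def rank(game):
        # BOOST(RPG): multiply the base score when a boosted tag is present
        multiplier = boost if game["tags"] & boost_tags else 1.0
        return game["score"] * multiplier

    kept = [g for g in games if not excluded(g["tags"])]
    return sorted(kept, key=rank, reverse=True)

games = [
    {"name": "A", "tags": {"Adventure", "Puzzle"}, "score": 0.9},
    {"name": "B", "tags": {"RPG"}, "score": 0.5},
    {"name": "C", "tags": {"Adventure"}, "score": 0.8},
]
result = filter_and_boost(games, [{"Adventure", "Puzzle"}], {"RPG"})
# "A" is dropped (Adventure AND Puzzle); "B" outranks "C" (0.5*2.0 > 0.8)
```

The point being: the storefront already has all the tag data this needs; it just doesn't expose the combinators.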
> for sites like Steam, YT, Netflix I always wish for more tuning parameters because their recommendations are all horrible.
The problem with those recommendation engines is that they're not optimized to serve your needs. They're optimized to serve the goals of their respective companies. And the problem comes when your needs are a little incompatible with the needs of the company.
Consider Netflix for example. Their recommendations don't seem to care much about what you actually enjoy the most. They pay different amounts of money to let you watch different content. So your goal of watching the thing you would enjoy the most is different from whatever the hell their goals are-- probably to get you to watch just enough Netflix to make you not want to cancel your subscription, but not run up their bills on bandwidth and licensing fees.
I'm absolutely confident Netflix could make amazing recommendations-- and probably already has them internally. But it's not in their best interest to give recommendations that are in your best interest.
To the extent that I'm right about this it could be a market opportunity to make an honest and useful recommendation service.
Netflix, I think, killed their own recommendation algorithm when they removed stars and made ratings boolean. I don't know if those buttons even do anything anymore, because I think they're just matching based on demographics and who they're trying to market to now. It's recommending shows that I would never watch in a million years, and giving them high match percentages despite me disliking most similar shows, just because I'm a 20-something male.
All the technology hype aside, it often feels as if these feed prediction algorithms are akin to weather forecasting. That is, they're not forecasting per se, but taking educated guesses based on some set of knowns. The problem is that correlation is not causation.
Part of that trend is because recommendation algorithms aren't all that interested in making recommendations anymore. They're interested in getting sales or views. To do this, they use dumber algorithms that are easier to understand ("people who bought this also bought...") and over-value recency. They're not helping you find new content, they're trying to prolong the time you're on YouTube.
I have been paid to write an algorithm like this. It matched you to relevant content, but then a manually configured weighting table would skew results toward more popular sources. It even took into account the number of leads we had sent each partner that month, as a percentage of their quota.
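In case anyone's curious what that looks like mechanically, here's a hypothetical reconstruction (the partner names, quotas, and boost factor are all made up) of relevance scores being skewed by how far behind each partner is on quota:

```python
# Hypothetical post-scoring skew: boost partners who are furthest behind
# on their monthly lead quota.

def skew_by_quota(results, quotas, leads_sent, max_boost=1.5):
    """results: list of (partner, relevance) pairs."""
    def adjusted(partner, relevance):
        fill = leads_sent.get(partner, 0) / quotas[partner]  # fraction of quota met
        # A partner at 0% of quota gets the full boost; at 100%+, none.
        boost = 1.0 + (max_boost - 1.0) * max(0.0, 1.0 - fill)
        return relevance * boost

    return sorted(results, key=lambda pr: adjusted(*pr), reverse=True)

results = [("acme", 0.8), ("globex", 0.7)]
ranked = skew_by_quota(
    results,
    quotas={"acme": 100, "globex": 100},
    leads_sent={"acme": 100, "globex": 0},   # globex hasn't met any quota yet
)
# globex jumps ahead: 0.7 * 1.5 = 1.05 beats acme's 0.8 * 1.0
```

The user still sees "relevant" results; they just aren't the most relevant ones.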
YouTube is definitely over-weighting the last few videos. Then again, it's likely optimized for a different market segment, especially with how many children are given tablets. Like, brief intense fascination with many subjects is genuinely hard to optimize for, especially when a huge portion of your market has long-lasting intense fascination with a few subjects.
This is often the behavior I want. I frequently queue up a song on YouTube and then let it keep auto-playing, if I hit one I don't care for I usually select off the sidebar of recommendations.
This only works because YouTube stays focused on what you're doing without trying to build a bigger picture of who you are. It would be nice if it could somehow do both, but I don't have a clear picture of how that UI would function.
YouTube's home page system is quite hackable in a way, but you need to use the search or other recommended videos to change your viewing pattern.
In other words, it doesn't consider you as someone with a long history. You can change your profile from a conservative to a liberal in a few hours of watching videos. Whether it's possible to have a balanced amount of crazy (not the same as a centrist) is something I'm currently working on; it requires effort!
YouTube would rather you fall into a trance of continuing your recent viewing habits rather than providing a personalized library. It just makes better business sense.
As a largely geeky population, perhaps we do need to accept that our interests are more varied, and often more intense, than those of the average person. For some people, being mildly amused by every fart-joke movie available is more rewarding than watching an emotionally challenging drama, a thoughtful comedy, and eight documentaries about four topics.
It is not all that bad. If my last few videos were metal songs, I am likely to want some more metal in the next one. I am unlikely to want jazz, a comedy sketch, or a kids' cartoon (despite all of these having been watched from my account in the last few days).
> Instead, the YouTube homepage seems myopically focused on the last 5-10 videos I watched.
Noticed the same thing happening for the past year or two.
I used to be able to go on YouTube and find a variety of interesting videos I'd never seen before. It seemed like there was a good balance of categories in the recommendations.
Now, I watch one boating video and suddenly my recommendations are 100% boat related with some random clickbait/viral garbage sprinkled in for good measure.
YouTube is weird. I made the mistake of watching a flat earth video last year in order to get a handle on that growing weirdo fad. Well it turns out that flat earthers are really avid content consumers (presumably because of the enormous cognitive reinforcement requirements to maintain such a belief) and YouTube really wanted to help me with that. It took a couple of months for it to stop offering me a portal to a better, flatter world every time I refreshed a page.
https://www.youtube.com/feed/history

Once a video is removed from your history, you'll no longer get recommendations based on it.
If you aren't logged in, you can't view or edit your viewing history, but you can at least clear it. (Which will return you to YouTube's terrible default viewer profile... sigh)
I similarly had that happen with some GamerGate stuff. Watched a video just to see if I could wrap my head around it, went "Nope!" and went on with my life, and the next thing I knew YouTube was desperately trying to send me down the alt-right rabbit hole.
It's a hard problem. If they kept recommending old topics, you'd be like "hey, I quit watching pink zebra videos 5 years ago, stop recommending that stuff to me".
My supposition is that this is a response to changing tastes over time. However, I think a window of 10 videos or fewer is probably an overreaction to this problem.
For example, let's say they have a user who watches 100 videos a week, for ease of math. 50 of those videos are in "core" areas of interest - these do not change over time. An additional 35 are in areas of secondary interest: topics which have piqued the viewer's curiosity, but not deeply interested them. We can expect these topics to change every [1,4] weeks. The remaining 15 are referrals or clickbait from other websites.
How can YouTube differentiate between these three classes of videos? The first class will be heavily represented in their subscriptions. Presumably, the viewer will prefer their recommendations to ignore the third class (clickbait). The second class is the hardest, as the user may want these videos surfaced, but then want them to decay over time as their interests change.
I think this is the problem that they are attempting to solve, with varying degrees of success.
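One plausible mechanism for that decay (purely speculation on my part, not anything YouTube has documented) is to exponentially down-weight each watch event by its age: core topics keep getting refreshed and stay heavy, while a one-off binge fades within a few weeks. A toy sketch with invented numbers:

```python
# Toy interest model: each topic's weight is the sum of its watch events,
# exponentially decayed by age. Core topics are refreshed weekly and stay
# high; a secondary interest binged once fades within a few weeks.

HALF_LIFE_DAYS = 14  # assumed: a watch event loses half its weight in two weeks

def topic_weight(watch_days_ago):
    """watch_days_ago: list of how many days ago each watch happened."""
    return sum(0.5 ** (d / HALF_LIFE_DAYS) for d in watch_days_ago)

core = topic_weight([1, 8, 15, 22])        # watched every week, ongoing
passing = topic_weight([25, 26, 27, 28])   # a four-day binge, a month ago

# Both topics have four views, but the binge's weight has decayed well
# below the core topic's.
assert core > passing
```

Under a scheme like this, the clickbait class mostly takes care of itself: isolated one-off views never accumulate enough weight to dominate the recommendations.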
It's not just over-tuning, but finding me things I might have missed. Not necessarily popular things, but things that appeal to me.
For example, I get emails from Pocket "You saved a popular article..." And I think "Who cares?" Don't tell me what I know, tell me what I missed. YT is similar. It recommends things I've watched. I'm looking for new and interesting and I'm getting yesterday's news? That doesn't excite me.
I hate this. Several years of nothing but a single genre of music in my history but watch one video about something different and get nothing but recommendations based on that. I have to clean up my watch history daily.
I've moved pretty much all my music listening to YouTube due to simple convenience of easily being able to generate whole genre playlists based on one song.
But because the playlists are dynamic, YouTube keeps on shoving songs from other playlists (and genres) into each other, trying to generate a "perfect playlist" and in the process making all the playlists sound very similar with no more genre distinctions except for the first couple of songs.
Yeah, but I get the impression history is retained and utilized. If I watch 2-3 how-to videos, I start getting how-to videos that are similar to my history, not just random how-tos. Then if I switch to skateboarding, I get skateboarding videos similar to my history.

This seems like a good algorithm to me, as when I'm watching skateboard videos with my friends, I don't want "how to caulk tile joints" to show up.
Well, it sometimes makes sense. I, and many others, use YouTube as a music streaming service. Most people listen to most songs they like more than once. I dare say YT has gotten a good amount of ad revenue out of me listening to Darren Emerson's dub extravaganza mix of "Black Sky" by Shakespears Sister, which is always in the recommendation panel for some reason...
I concur. These days I have to clear my YouTube history every few weeks to prevent it from spamming my page (and my fucking TV!) with suggestions related to random videos I watched over a short period of time. The videos I liked over the last 7-8 years seem to carry less weight. I wish there was a way to tune this behavior.
If I had to guess, I would say YouTube is attempting to get you to watch new types of videos you've never watched before. By focusing on recent videos you've watched, they can try to convert you from a 1-off to a regular viewer of a particular genre.
I think this is the right approach; you're not, after all, the person you were 5 years ago, or even a week ago. Most interests change with time; I lose interest in a topic very fast, and gain interest in another just as quickly. No good recommender system should be based off a reading of my "personality", whatever that may be - the most stable aspects of my personality, even if they can be divined from my viewing history, say little as to what I would be interested in watching next.
I think that's a fair point, and I wouldn't expect a video I watched 10 years ago to factor in very heavily on what I'm seeing today. But at least in my experience, it doesn't appear that the engine takes anything that happened more than a few days ago into account.
As an example: I watched a few episodes of Penn and Teller: Fool Us yesterday. I hadn't really watched it before, and while I like Penn and Teller in general, I don't remember watching them all that much on YouTube prior to yesterday (I'm sure at some point I had watched a video or two, but not more so than anything else I stumbled on.)
Today, 12 of the top 30 videos on my YouTube home page are specifically Penn and Teller: Fool Us. Not magic in general, not Penn and Teller in general, but specifically that show. That seems like the very recent past is way overrepresented.
This is because they're not trying to give you stuff you'd generally like given your entire history. The recommendation feature powers the part of YouTube that auto-plays the upcoming video.
So e.g. if I go and view Russian dashcam videos they're going to automatically play more of them, even though I've shown no prior interest in that topic.
Having two systems for recommendations would introduce a lot of UI complexity, so I can see why they didn't go for that, and why the recommendations are consequently tuned for people who are actively watching videos on some topic right now.
> Having two systems for recommendations would introduce a lot of UI complexity,
Do you mean in terms of implementation, or for the user? Because if it's the latter, I think the two are already conceptually different (homepage vs next video), so I don't see how just feeding it different videos would make it more complex.
I imagine it would be possible (though difficult) to autoplay videos which are related to each other and gradually converge to something that would interest the viewer.
For example, Russian dash cams to Russia at night to the Russian sleep experiment creepypasta, to horror games to video games in general, if that is what the user tends to watch.
I know this has graph theory written all over it and the shortest-distance problem has wreaked havoc for centuries, but I think with enough resources Google/YouTube could find a good compromise in this situation.
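As a toy illustration (the graph and topics here are invented, and a real system would presumably work in embedding space rather than an explicit graph), the convergence described above is basically a shortest path over a related-videos graph:

```python
from collections import deque

# Toy related-videos graph: autoplay could walk a shortest path from the
# current topic toward a topic the user historically watches.

RELATED = {
    "russian dashcams": ["russia at night"],
    "russia at night": ["russian sleep experiment", "russian dashcams"],
    "russian sleep experiment": ["horror games", "russia at night"],
    "horror games": ["video games", "russian sleep experiment"],
    "video games": ["horror games"],
}

def autoplay_path(start, target):
    """Breadth-first search: the shortest chain of related videos."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in RELATED.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

path = autoplay_path("russian dashcams", "video games")
# walks: dashcams -> night -> sleep experiment -> horror games -> video games
```

The hard part isn't the search, of course; it's building a graph whose edges mean "a human would plausibly enjoy this next".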
tl;dr, as I understand it: when family members Like your Facebook content in relatively quick succession, FB apparently interprets it as a signal that it is family-specific content. I didn't see any metrics, but this seems plausible.
I think I'm more of a fan of FB than the average web geek, probably because I used it at its phase of peak innocence (college years) and have since weaned myself off to the point of checking it on a less-than-weekly basis. I also almost never post professional work there, nor "friend" current colleagues. Moreover, I've actively avoided declaring familial relationships (though I have listed a few fake relationships just to screw with the algorithm). But wasn't the feature of making yourself a "brand page" and/or having "subscribers" (which don't count toward the 5,000 friend limit) supposed to mitigate this a bit? I guess I'm so used to keeping Facebook solely for personal content (and using Twitter for public-facing content) that I'm out of touch with the sharing mechanics. That, and anecdotal experience of how baby/wedding pics seems to be the most Liked/Shared content in my friend network.
> But wasn't the feature of making yourself a "brand page" and/or having "subscribers" (which don't count toward the 5,000 friend limit) supposed to mitigate this a bit?
If you have a page, Facebook wants to milk you, so they've got this weird algorithm that pushes your posts to only a small fraction of the page's followers, expecting you to start "promoting" them.
Basically they screwed "organic reach".
Oh, and about 3 days ago I created a second Facebook account with the purpose of connecting with software developers and my English-speaking friends (I'm Romanian). I did this thinking that I don't want to share semi-private pictures of family with strangers, or to spam my family and friends with programming stuff.
But only 24 hours later they've disabled my account because of "security concerns", without notice and now I'm waiting on their support to reply after I've sent them my picture for validation.
And another thing - the online parents group from my son's school is on WhatsApp. They tried a Facebook group, but the problem is that when important announcements happen, not all parents receive notifications, so they resorted to something that works.
> But only 24 hours later they've disabled my account because of "security concerns", without notice and now I'm waiting on their support to reply after I've sent them my picture for validation.
I'm not Facebook's biggest fan, but I think this is a valid security concern (I'm assuming you used the same name as your initial account). Cloning FB accounts and impersonation is a valid threat vector for getting inside someone's network: when accepting friend requests, most people don't double-check if they are already friends with the purported requester.
> If you have a page, Facebook wants to milk you, so they've got this weird algorithm that pushes your posts to only a small fraction of the page's followers, expecting you to start "promoting" them.
While this does enable Facebook to force brands to buy advertising, it is also a user-friendly change.
Nobody who I have spoken to about this thinks that "like/follow" = "I want to see everything they post" on Facebook. People like restaurants they had a nice dinner at and want to publish their support, it doesn't mean they want to see the daily special every day. Even if they "follow" you, it doesn't mean it should show up before a friend's holiday pictures today, and tomorrow there is already new content.
The reality is that most Facebook users do not care about the posts from pages they like/follow, even if they pressed the button some time ago. An automated filter that keeps these posts from showing up is good for the user experience.
It's the same as "facebook friends" - they aren't real friends, just people you met once at a party in college. It might be interesting to see a post from them once in 10 years when they get married or move to a new country, but not their daily life. The same applies for brand pages - a like should give you once in 10 years access to their feed, but not more.
Ah, that explains a lot. I know very little about social media marketing and wondered why my attempts to launch a few pages related to art projects were drawing such little attention from my social circle. Being me I assumed it was because my art is shit and largely abandoned a project I'd spent months developing. Zucked again.
...and it causes the political echo chamber problem, which results in extreme polarization of whole societies based on a false perception of what the other side is doing vs. what they're actually doing. Terrible indeed, but it generates views, and thus revenue.
> If you have a page, Facebook wants to milk you, so they've got this weird algorithm that pushes your posts to only a small fraction of the page's followers, expecting you to start "promoting" them.
> Basically they screwed "organic reach".
The average user follows way too many pages, and has too many friends all posting, to see all of that content every day.

The algorithm isn't there to milk you; it's there to stop the news feed from becoming the MySpace bulletin feed (which was an effective way to post in 2007, so long as you posted 5x every hour).
I'd like Facebook to have an option to see all posts, without filtering, just as they're posted. It's not hard, it's a simple UX, but it's just not there.
I kind of understand the point of not showing everything: too many people producing too much content (and it's hard to decide whom to unfollow when everyone produces more or less the same ratio of quality to non-quality content), which a few years ago led me to abandon Twitter after just a few days. I followed a handful of people and still spent the whole morning each day just catching up with yesterday's tweets. This seems to be less of a problem these days, but that might be subjective.
What makes me go nuts though is stuff disappearing from my FB wall (using mbasic web). Lately I see that a lot: see a few interesting entries on timeline, I click one, come back to the wall and they've disappeared and some random stuff from 3 days ago took their place and I can't find them again.
Same happens in YouTube Android app. See a few interesting videos, click one (I can't "open in new tab" in the app...), go back, they're gone and replaced with something totally unrelated. Ugh.
They used to do that. They had two different news feeds, one with their algorithm and one that was just a real-time feed. Then they scrapped the real-time feed in favor of their algorithm-based feed. It's not that the real-time was hard to do, it's that the algorithm is more easily monetized.
Moving the feed from 'chronological' to 'algorithmic' obfuscates the true quantity of actual content on the user's timeline, thereby allowing a higher density of ads and other sponsored content without the user necessarily being able to tell or prove it.
I hypothesized about other benefits and drawbacks here [1].
Yup. It was a huge deal in, what, 2008, when they did this? Almost as big as when they got rid of curated content and turned everyone's link-able "interests" into weird generic "likes" (it was a really strange period for at least a few months as I recall).
For a while you could manually switch back to a chronological news feed every time you logged in.
Instagram was one of the last holdouts but they switched a year or two ago.
Twitter is amazingly bad at this. It takes screenfuls of junk on mobile to get to the chronological stuff.
There used to be a Feed API, but it was removed for privacy reasons... people (understandably) didn't like that their friends could allow another company to read their posts.
It's probably a core part of their scalability that they never have to produce a complete list of posts. Every fetch is on a best-effort basis, within a small time constraint.
They don't even produce complete search results. Go to a group with a long history, and use "search this group" for a keyword that will return over, say, 100 results--some of them a couple of years old.
You won't see many of the older posts, and facebook gives no indication that they have been pruned. It's very hard to find old things without just scrolling through chronologically, which is slow and error prone.
No, that's still from your own bubble that you see on "Top Stories", just ranked "newest first". Not the full list you've subscribed to (as in, every single page and every single friend).
I believe if you create a "group" and then add your entire friends list to it (this involves lots of clicking btw, take a few letters of the alphabet per day) you can accomplish this if you really want.
I feel like these are just shitty models. A good recommendation model would get features like "is_mom" and learn that "is_mom" is a shitty predictor of relevance.
Similarly with Amazon, products should have some sort of 'elasticity' score, where the model learns that recommending inelastic products is a waste of screen real estate. I mean, I doubt the model is giving a high % to most of those recommends - it's likely more a business/UX issue, in that they've decided it's worth showing you low-probability recommends instead of a cleaner page (or something more useful).

YouTube, on the other hand, seems to be precision-tuned to get you to watch easy-to-digest crap. You consume the crap voraciously but are generally left unfulfilled. This is a more difficult problem, where you're rewarding naive views rather than a harder-to-discern 'intrinsic value' metric. As a 'long term' business play, the model should probably weight more intellectually challenging content, just like fast food restaurants should probably figure out how to sell healthier food: by peddling crap you're only meeting consumers' immediate needs, not their long-term ones.
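As a toy sanity check of the "is_mom" claim (all names and data here are invented): a feature with no relationship to clicks shows ~zero correlation with the label, which is exactly the signal a decent training pipeline would use to drive its weight toward zero:

```python
# Toy illustration: an uninformative feature ("is_mom") shows ~zero
# correlation with the click label, so a sane model learns to ignore it.
# All data is synthetic and deterministic.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One row per (interest level, is_mom) pair; clicks follow interest only.
rows = [(i / 10, mom, 1 if i >= 5 else 0)
        for i in range(10) for mom in (0, 1)]
interest = [r[0] for r in rows]
is_mom = [r[1] for r in rows]
clicked = [r[2] for r in rows]

assert abs(pearson(is_mom, clicked)) < 1e-9   # is_mom tells you nothing
assert pearson(interest, clicked) > 0.8       # interest is predictive
```

So when a model keeps recommending based on "is_mom"-type features, it's a reasonable guess that the objective being optimized isn't relevance to you at all.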
Yesterday I sent one of my friends a link to an old entry - 4.5 years old, from December 2013 - that he wrote as a Facebook note. There were 70+ likes, 30+ commenters and 110 comments on it.
He added a new comment yesterday - I only saw it, because I randomly decided to read through the comments.
Those who commented on it should have received a notification - well, in the end, 2 people got something.
This is how you effectively kill conversation - which baffles me, because keeping conversations running = engagement, which is supposed to be one of the end goals.
I get the "need" for a filter bubble, even though I'd simply let people choke on the amount of crap they'd get if following actually meant getting everything - they might learn not to like things without thinking.
But not sending notifications at all? Why? Why is that good?
My guess is the point of those notifications is that it gives them an excuse to add points to the addictive little red number. It's not because they think you'll actually care.
Facebook also has a serious problem in that its news feed is a content recommendation engine with only positive reinforcement but no negative reinforcement. So you end up with a ton of false positives even when actively interacting with the content, and their system doesn't even know how wrong it is.
And should you really not like some content, the solution is unfriending the poster, rather than simply matching against that type of content (political, religious, etc).
The fact there isn't a private dislike button (that no one sees you clicked other than Facebook), is remarkable at this point. It's either woefully obtuse, or intentional so that a feed of false positives better hides moderately targeted ads.
> The fact there isn't a private dislike button (that no one sees you clicked other than Facebook), is remarkable at this point. It's either woefully obtuse, or intentional so that a feed of false positives better hides moderately targeted ads.
I vote the last option. I think there was an explanation, basically saying it would make people upset or sad and they wanted to avoid that. The solution wound up being a variety of emotions to pick from (which counts more than a simple like). They included negatives like "angry" or "sad".... I'd still just rather have a simple dislike button.
Not really? It's a very different system, there's no plausible way to have a "private" downvote when the "recommendation system" here is entirely crowdsourced rather than personalized. I suppose the hide button is sort of an equivalent, but I don't think the systems are really comparable.
Pruning my friends improved my experience, but they grow back over time as people rediscover you or you make new acquaintances. I wonder if you could make a game or website that forced you to prune your facebook friends down, perhaps by looking at your social graph and telling you that you haven't messaged with a person in x years or something, or that you've never even "liked" something they posted. Some heuristics to determine who you really wouldn't miss. The problem with unfriending is that Facebook's UI makes it almost impossible to do in batches, and it feels kind of rude to "unfriend" someone, even though I personally wouldn't care if someone I don't talk to unfriended me.
A game rewarding you for removing Facebook friends? Burger King did that years ago, got quite a bit of exposure thanks to it: sacrifice 10 friends, get a Whopper:
Here's a trick to constantly pruning down your friends list. Don't worry about pruning down your whole list, just check the people who have birthdays every day you log in. I just checked; there was a single friend who had a birthday today, and I haven't had any contact with her in years. That makes for a nice easy prune.
Of course the downsides are that you risk disappointing someone who obsessively checks their friend count on their birthday, and that it only really works if you log in every day.
I did this. One day I just decided to make my Facebook account family only - removed hundreds of "friends". It worked great for me. I don't idly check it so often (because there are fewer updates), but I still get to see what people I care about are up to.
> Amazon is often the best of them for books because their engine seems to only really value the last few things I looked at / bought and so just recommends really similar stuff.

"Oh, you bought a wallet? Here are twenty wallets of the exact same type, but with slightly different colors, that you might want to buy too."
> It would be nice if it could somehow do both, but I don't have a clear picture of how that UI would function.

Of course it's possible they have, and realised they still make more money by focusing on the more recent videos.
For example, let's say they have a user who watches 100 videos a week, for ease of math. 50 of those videos are in "core" areas of interest - these do not change over time. An additional 35 are in areas of secondary interest: topics which have piqued the viewer's curiosity, but not deeply interested them. We can expect these topics to change every [1,4] weeks. The remaining 15 are referrals or clickbait from other websites.
How can YouTube differentiate between these three classes of videos? The first class will be heavily represented in their subscriptions. Presumably, the viewer will prefer their recommendations to ignore the third class (clickbait). The second class is the hardest, as the user may want these videos surfaced, but then want them to decay over time as their interests change.
I think this is the problem that they are attempting to solve, with varying degrees of success.
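The core/secondary/clickbait split above could be sketched as a scoring function. This is purely illustrative (a hypothetical scoring scheme, not anything YouTube is known to use): subscriptions anchor "core" topics at full weight, other watch-history topics decay exponentially, and known clickbait referrals are zeroed out.

```python
import time

# Hypothetical half-life for secondary interests, matching the
# "changes every 1-4 weeks" guess above.
HALF_LIFE_DAYS = 14.0

def interest_score(topic, watch_events, subscriptions, clickbait_topics, now=None):
    """watch_events: list of (topic, unix_timestamp) pairs.

    Core topics (subscriptions) stay fully weighted; secondary topics
    decay with watch-event age; clickbait contributes nothing.
    """
    if topic in clickbait_topics:
        return 0.0
    if topic in subscriptions:
        return 1.0  # core interests: no decay
    now = now if now is not None else time.time()
    score = 0.0
    for t, ts in watch_events:
        if t == topic:
            age_days = (now - ts) / 86400.0
            score += 0.5 ** (age_days / HALF_LIFE_DAYS)  # exponential decay
    return score
```

With a 14-day half-life, a topic watched once two weeks ago scores 0.5, so a passing curiosity fades out of the recommendations on roughly the timescale the comment above suggests.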
For example, I get emails from Pocket "You saved a popular article..." And I think "Who cares?" Don't tell me what I know, tell me what I missed. YT is similar. It recommends things I've watched. I'm looking for new and interesting and I'm getting yesterday's news? That doesn't excite me.
But because the playlists are dynamic, YouTube keeps on shoving songs from other playlists (and genres) into each other, trying to generate a "perfect playlist" and in the process making all the playlists sound very similar with no more genre distinctions except for the first couple of songs.
This seems like a good algorithm to me, as when I'm watching skateboard videos with my friends, I don't want "how to caulk tile joints" to show up.
But for me it is just silly.
As an example: I watched a few episodes of Penn and Teller: Fool Us yesterday. I hadn't really watched it before, and while I like Penn and Teller in general, I don't remember watching them all that much on YouTube prior to yesterday (I'm sure at some point I had watched a video or two, but not more so than anything else I stumbled on.)
Today, 12 of the top 30 videos on my YouTube home page are specifically Penn and Teller: Fool Us. Not magic in general, not Penn and Teller in general, but specifically that show. That seems like the very recent past is way overrepresented.
So e.g. if I go and view Russian dashcam videos they're going to automatically play more of them, even though I've shown no prior interest in that topic.
Having two systems for recommendations would introduce a lot of UI complexity, so I can see why they didn't go for that, and why the recommendations are consequently tuned for people who are actively watching videos on some topic right now.
Do you mean in terms of implementation, or for the user? Because as the latter, I think the two are already conceptually different (homepage vs next video), so I don't see how just feeding it different videos would make it more complex.
For example, Russian dash cams to Russia at night to the Russian sleep experiment creepypasta, to horror games to video games in general, if that is what the user tends to watch.
I know this has graph theory written all over it and the shortest-distance problem has wreaked havoc for centuries, but I think with enough resources Google/YouTube could find a good compromise in this situation.
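The topic-chaining idea above can be sketched as a short walk over a topic-similarity graph, where each next recommendation is a neighbor of the current topic but re-ranked by the user's long-term profile. Everything here (the edges, the profile weights) is made up for illustration:

```python
from collections import defaultdict

# Hypothetical topic graph built from the chain described above.
edges = [
    ("russian dashcams", "russia at night"),
    ("russia at night", "russian sleep experiment"),
    ("russian sleep experiment", "horror games"),
    ("horror games", "video games"),
]

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def next_topics(current, long_term_profile):
    """Rank neighbors of the current topic by long-term affinity,
    so the session drifts toward what the user tends to watch."""
    return sorted(graph[current], key=lambda t: -long_term_profile.get(t, 0.0))

# A user who historically watches gaming content:
profile = {"video games": 0.9, "horror games": 0.4}
next_topics("russian sleep experiment", profile)
# -> ["horror games", "russia at night"]
```

The point of the sketch is the compromise: the current session constrains which hops are allowed, while the long-term profile decides which hop to take, instead of either signal winning outright.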
I think I'm more of a fan of FB than the average web geek, probably because I used it at its phase of peak innocence (college years) and have since weaned myself off to the point of checking it on a less-than-weekly basis. I also almost never post professional work there, nor "friend" current colleagues. Moreover, I've actively avoided declaring familial relationships (though I have listed a few fake relationships just to screw with the algorithm). But wasn't the feature of making yourself a "brand page" and/or having "subscribers" (which don't count toward the 5,000 friend limit) supposed to mitigate this a bit? I guess I'm so used to keeping Facebook solely for personal content (and using Twitter for public-facing content) that I'm out of touch with the sharing mechanics. That, and anecdotal experience of how baby/wedding pics seems to be the most Liked/Shared content in my friend network.
If you have a page, Facebook wants to milk you, so they've got this weird algorithm that pushes your posts to only a small fraction of the page's followers, expecting you to start "promoting" them.
Basically they screwed "organic reach".
Oh, and about 3 days ago I created a second Facebook account with the purpose of connecting with software developers and my English-speaking friends (I'm Romanian). I did this thinking that I don't want to share semi-private pictures of family with strangers, or to spam my family and friends with programming stuff.
But only 24 hours later they disabled my account because of "security concerns", without notice, and now I'm waiting for their support to reply after I sent them my picture for validation.
And another thing - the online parents group from my son's school is on WhatsApp. They tried a Facebook group, but the problem is that when important announcements happen, not all parents receive notifications, so they resorted to something that works.
Facebook is freaking terrible.
I'm not Facebook's biggest fan, but I think this is a valid security concern (I'm assuming you used the same name as your initial account). Cloning FB accounts and impersonation is a valid threat vector for getting inside someone's network: when accepting friend requests, most people don't double-check whether they are already friends with the purported requester.
While this does enable Facebook to force brands to buy advertising, it is also a user-friendly change.
Nobody I have spoken to about this thinks that "like/follow" means "I want to see everything they post" on Facebook. People like restaurants where they had a nice dinner because they want to publish their support; it doesn't mean they want to see the daily special every day. Even if they "follow" you, that doesn't mean your post should show up before a friend's holiday pictures today, and tomorrow there is already new content.
The reality is that most facebook users do not care about the posts from pages they like/follow, even if they pressed the button some time ago. An automated filter that keeps these posts from showing up is good for the user experience.
It's the same as "facebook friends" - they aren't real friends, just people you met once at a party in college. It might be interesting to see a post from them once in 10 years when they get married or move to a new country, but not their daily life. The same applies for brand pages - a like should give you once in 10 years access to their feed, but not more.
> Basically they screwed "organic reach".
Average user follows way too many pages and has too many friends all posting to see all content every day.
The algorithm isn't there to milk you; it's there to stop the news feed from becoming the MySpace bulletin feed (which was an effective way to post in 2007, as long as you posted 5x every hour).
Edit: I just made a super simple bookmarklet that does it: http://mikelyons.org/2017/06/12/Google-Cache-Javascript-Shor...
What makes me go nuts though is stuff disappearing from my FB wall (using mbasic web). Lately I see that a lot: see a few interesting entries on timeline, I click one, come back to the wall and they've disappeared and some random stuff from 3 days ago took their place and I can't find them again.
Same happens in YouTube Android app. See a few interesting videos, click one (I can't "open in new tab" in the app...), go back, they're gone and replaced with something totally unrelated. Ugh.
I hypothesized about other benefits and drawbacks here [1].
[1] https://news.ycombinator.com/item?id=12190724
For a while you could switch back every time you logged in, manually, to a chronological news feed.
Instagram was one of the last holdouts but they switched a year or two ago.
Twitter is amazingly bad at this. It takes screenfuls of junk on mobile to get to the chronological stuff.
You won't see many of the older posts, and facebook gives no indication that they have been pruned. It's very hard to find old things without just scrolling through chronologically, which is slow and error prone.
Drives me insane.
[0] http://imgur.com/a/gg2DH
FWIW I would like that option as well.
Similarly with Amazon, products should have some sort of 'elasticity' score, where the engine learns that recommending inelastic products is a waste of screen real estate. I mean, I doubt the model is giving a high % to most of those recommends - it's likely more a business/UX issue in that they've decided it's worth showing you low-probability recommends instead of a cleaner page (or something more useful).
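A minimal sketch of that elasticity idea (the scores and threshold below are invented for illustration; a real system would estimate repeat-purchase rates from sales data): suppress same-category recommendations for products people rarely buy twice in a short window.

```python
# Hypothetical repeat-purchase elasticity per category: how likely a buyer
# is to want another item of the same kind soon after buying one.
ELASTICITY = {
    "wallet": 0.05,     # nobody needs a second wallet next week
    "book": 0.8,        # readers keep buying books
    "phone_case": 0.1,
}

def worth_recommending(category, threshold=0.3):
    """Skip same-category recommends for inelastic products;
    unknown categories default to a neutral 0.5."""
    return ELASTICITY.get(category, 0.5) >= threshold

worth_recommending("wallet")  # False: no "twenty more wallets" after a wallet purchase
worth_recommending("book")    # True: book buyers plausibly want more books
```

This matches the "twenty wallets" complaint earlier in the thread: the purchase itself is a strong signal that the need has just been satisfied.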
YouTube, on the other hand, seems to be precision-tuned to get you to watch easy-to-digest crap. You consume the crap voraciously but are generally left unfulfilled. This is a more difficult problem, since you're rewarding naive views rather than a harder-to-discern 'intrinsic value' metric. As a 'long term' business play the model should probably weight more intellectually challenging content, just like fast food restaurants should probably figure out how to sell healthier food, because by peddling crap you're only meeting consumers' immediate needs, not their long-term ones.
He added a new comment yesterday - I only saw it, because I randomly decided to read through the comments.
Those who commented on it should have received a notification - well, in the end, 2 people got something.
This is how you effectively kill conversation - which baffles me, because keeping conversations running = engagement, which is supposed to be one of the end goals.
I get the "need" for a filter bubble, even though I'd simply let people choke on the amount of crap they'd get if "follow" actually meant getting everything - they might learn not to like things without thinking.
But not sending notifications at all? Why? Why is that good?
$friendINeverTalkTo has commented on $PostByPageIDontFollow is NOT something I want to get notified about. Neither is
XYZ has uploaded a photo
And should you really not like some content, the solution is unfriending the poster, rather than simply matching against that type of content (political, religious, etc).
The fact there isn't a private dislike button (that no one sees you clicked other than Facebook), is remarkable at this point. It's either woefully obtuse, or intentional so that a feed of false positives better hides moderately targeted ads.
It's to the point now where I just log out so I can see what's available, rather than a continuous feed of what I already watched.
That solution doesn't require defriending or unliking a page. Facebook uses this feedback when recommending content.
I vote the last option. I think there was an explanation, basically saying it would make people upset or sad and they wanted to avoid that. The solution wound up being a variety of emotions to pick from (which counts more than a simple like). They included negatives like "angry" or "sad".... I'd still just rather have a simple dislike button.
...note that I can choose to dislike the post itself, or hide the entire source or app if I prefer.
https://www.cnet.com/news/delete-10-facebook-friends-get-a-f...
Of course the downsides are that you risk disappointing someone who obsessively checks their friend count on their birthday, and that it only really works if you log in every day.