RSS readers scrape surprisingly often, but the file is small and highly cacheable, and most of them do a HEAD request initially and never ask for the file again if it hasn't changed. Despite accounting for a large percentage of requests, RSS clients account for only a small portion of traffic.

The cool thing is that a lot of these RSS readers are shared (commercial feed readers, ttrss instances, etc.), so they cache the feed on their end and actually save me bandwidth versus an equivalent number of users accessing the site directly. A lot of them put the subscriber count in their UA string, so I can see that e.g. Feedly is making regular requests during the day but has 30-some users behind those requests. That's a better deal from a cost perspective than those 30-some people each checking the site every day.

> But I've heard that it requires creating a massive feed file containing many articles, including text and images

An RSS or Atom feed is just a text XML document adhering to a certain schema. So a) no, it doesn't contain images, just URLs to images where necessary, and b) the feed file is only as big as the site decides it needs to be. For example, a typical blog or news organization could generate a feed containing the last 24 hours' worth of new content. Feed readers will poll that and pull in new content while ignoring existing content. In short: this should be a non-issue unless you're turning out massive amounts of new material on a regular basis, in which case you're probably a major news organization and can afford it.

> As a result, some blogs (like christine.website) stopped serving article contents in the RSS feeds.

This isn't because of bandwidth concerns. It's because they can't advertise in an RSS feed, so they want to drive eyeballs to the site. Personally, I run ttrss and use a plugin that scrapes the content from the source site and embeds it right in my feed.
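To illustrate how small such a feed is in practice, here's a minimal sketch in Python of a generator that emits an RSS 2.0 document containing only the last 24 hours of posts. The site name, URLs, and post structure are invented for the example; images would appear only as URLs inside item descriptions, never as embedded data.

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime
from xml.etree import ElementTree as ET

def build_feed(posts, window_hours=24, now=None):
    """Build a minimal RSS 2.0 feed containing only recent posts.

    Each post is a dict with 'title', 'url', and 'published'
    (a timezone-aware datetime). Older items are simply dropped,
    keeping the feed file small.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=window_hours)

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example Blog"
    ET.SubElement(channel, "link").text = "https://blog.example/"
    ET.SubElement(channel, "description").text = "Recent posts only"

    for post in posts:
        if post["published"] < cutoff:
            continue  # readers already pulled it in; no need to keep old items
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["url"]
        ET.SubElement(item, "pubDate").text = format_datetime(post["published"])

    return ET.tostring(rss, encoding="unicode")
```

A feed like this is a few hundred bytes per item, which is why even a busy site's feed stays far smaller than a normal page load.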
The whole podcasting space is going through a period of massive consolidation as podcast networks get bought up by the likes of Spotify and Apple (e.g. …). I won't be in the least bit surprised if those existing acquisitions are eventually forced to shut down their RSS feeds and go exclusive, and if new acquisitions start out that way, as Netflix has proven content is the way to drive more subscribers to a platform.

Meanwhile, Spotify and Apple have a huge advantage over RSS-based podcasts: data. The simple fact is, thanks to their walled-garden, app-centric platforms, they can collect more behavioural data and target advertising more effectively, which means they draw advertisers away and make it a lot harder for everyone else to compete.

Do these headwinds stop the small players from continuing to publish via RSS? No, not at all. But if the bulk of the money and audience shifts to those walled gardens, due to their owning the biggest names in the business while enabling the kind of surveillance capitalism that's so in demand, eventually the ecosystem built up around RSS is in danger of drying up.

I don't think performance/load has ever been a real reason to only serve partial content. The actual reason has always been getting users onto the site, where you can deliver advertising more flexibly along with analytics.

For my extremely minimal site, the RSS feed is still smaller than normal page loads because it only references graphics by URL.