Also, thanks for the new version from 2025-06-30 that you posted on your site! The layout separating the abavia and omicron locations was a nice addition, as was the ntd/dmca info for all backbones. Good work!
privadesco
From what I understand, you are trying to get access to international-language media.
Omicron's newzlazer, like easynews search, is useful because it can deobfuscate some stuff that you can't find manually in some indexers, but if you are looking for international content, what will matter for you is which indexers you have access to. Here are some options:
- SceneNZBs - Especially for content in German.
- Dog - A well-curated (better than usual, at least) and sometimes unique source of international media.
- Finder - A better-than-usual source for international media.
- Su - You probably already noticed this, but it is also good for international languages.
- Ninja - Like Dog, it has some unique content.
- WtF[nZb] - (not to be confused with the other indexer with a similar name) Good for international releases but very hard to interact with (complicated search and a limited API). Easynews' Web Search somehow deobfuscates some of their releases. Some Scandinavian content.
- DrunkenSlug - Also good for international releases.
- Spotweb (clubnzb, nzbstars) - Good for Dutch content.
Searching for "Non-Mainstream Movies Statistics in Usenet Indexers" on German Google should find you an analysis comparing the availability of hundreds of releases split across about 30 languages (it is a handmade study, not some crazy prowlarr stats).
Thanks for the explanation. I didn't know the peering was only for new content. I was thinking of simply downloading the full feed of another provider and adding it to their own (automatically, somehow). But if there is small print preventing that, then it can't be done.
I was thinking about the storage cost only.
By peering older days from the usenet itself? Doing it from nzbs could also be an option. There is a lot of old content on many indexers.
The main reason I feel trapped with Omicron is that I enjoy (not so much) cult content from every possible place and language, and a big part of it is from 2008~2013.
I see all the NewsDemon advertising and their way of doing business, and I feel sad that I can't just drop omicron for my use case.
Yes, geek has a very nice automated API for uploading a bunch of nzbs at once, but it requires a paid account. I will look into crawler; I didn't know they allowed it too. I know another place that allows it, but I'm too afraid to use it, and I also want things to become broadly available, not exclusively locked to a single indexer.
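The kind of thing I have in mind is a simple loop like the one below; the endpoint and parameter names are placeholders I made up, not geek's actual API, so you would have to adapt it to whatever the indexer documents for paid accounts:

```python
# Rough sketch of a bulk nzb upload loop. The URL, the apikey parameter and
# the form field name are placeholders, NOT the real geek API.
import pathlib
import requests

API_URL = "https://indexer.example/api/upload"  # hypothetical endpoint
API_KEY = "your-api-key"                        # key from the paid account

for nzb in sorted(pathlib.Path("to_upload").glob("*.nzb")):
    with nzb.open("rb") as fh:
        resp = requests.post(
            API_URL,
            params={"apikey": API_KEY},
            files={"file": (nzb.name, fh, "application/x-nzb")},
            timeout=60,
        )
    # Assume any 2xx answer means the nzb was accepted.
    print(nzb.name, resp.status_code)
```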
It would be nice to have a single place that every major indexer would pull from, instead of having to contact each one. These are not very popular things, but it would be great for preserving this media for the future.
Well, I believe I'm already not being very silent...
Nice api! Thanks for the link!
About the js, don't get me wrong, the site looks gorgeous as it is (with the js)!
It's just that, for no-script users, having it render a simple no-js page when the js can't be loaded is always nice. But take that as niche feedback.
About having it inside the SVG, well, that is the only thing I miss from the old versions. It was a really nice way to visualize it, especially offline. But I understand that the tree with the filters is far superior, and I thank you for that ;) .
It looks beautiful as it is, but would you consider adding the table data to the SVG itself, so we can keep it all as a single downloadable file for offline reading in the future?
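Something roughly like this is what I mean: the table rows could live inside the SVG itself via a foreignObject, so the diagram and the data stay in one file (the column names and rows below are invented, just to illustrate the idea):

```python
# Toy example of embedding an HTML table inside an SVG via <foreignObject>,
# so everything stays in a single downloadable file. The data is made up.
rows = [
    ("Backbone", "Location", "NTD/DMCA"),
    ("ExampleNet", "NL", "NTD"),
    ("OtherNet", "US", "DMCA"),
]
table_html = "".join(
    "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
    for row in rows
)
svg = f"""<svg xmlns="http://www.w3.org/2000/svg" width="600" height="400">
  <!-- the existing tree/diagram would go here -->
  <foreignObject x="0" y="220" width="600" height="180">
    <table xmlns="http://www.w3.org/1999/xhtml" border="1">{table_html}</table>
  </foreignObject>
</svg>"""
with open("map_with_table.svg", "w", encoding="utf-8") as fh:
    fh.write(svg)
```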
Also, the table filtering features are really nice and help a lot, and I know they require javascript, but would you consider maintaining a no-js copy of the full table, without filters, as an option?
Anyway, thanks for keeping this as it is already!
@[email protected] What opsec should I use with it?
Just mullvad and an xmr posting block?
And how do I make the nzbs show up at the indexers?
Any guides on posting that you would recommend?
(PS: btw, I am a silent fan of your site; keep up the good work, and congratulations on it)
Assuming you are from NewsDemon, I have a question, since newsdemon is a co-founder of a backbone (UsenetExpress):
Since the daily feed size has bloated so much, isn't it cheaper every day to add a lot of older content instead of keeping retention fully up to date?
Do you guys think about doing so?
I know you share stats showing that only a small percentage of downloads is old data, but there is so much good content available on old usenet that is lost to any other form of download these days (ISOs of full old DVDs that were never re-released on Blu-ray, for instance).
Would you guys consider picking it up in the future? I mean, going some days back in retention looks cheaper than keeping a new day up. Ain't it?
Ninja Central went paid-only; all free accounts have been deleted, even those that were paid at some point in the past.
New accounts are now 14-day trials.
US$20 per year, US$50 for 3 years. There are no free accounts, only 10-day trials, which require an invite when registration is not open.
Yes! Now every backbone has its ntd/dmca icon, and pure-NL omicron is easier to distinguish. It's really better! Thank you again!
One question: the Omicron NL, Base IP and Eweka backbones all share the same AS number (AS34343). Are they the same backbone?
Another thing that makes me think this is possible: if you use eweka servers as a vpn endpoint, myip.wtf will say the IP belongs to "Eweka Internet Services B.V.", while myip.com will say "Base IP B.V." for the same assigned address.
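If anyone wants to check that without relying on the myip sites, a raw whois query over port 43 shows what the registry has on record for an address. A rough sketch (the server choice, whois.ripe.net, and the placeholder address are my assumptions; substitute the actual endpoint IP you see):

```python
# Rough sketch: ask the RIPE whois server what it has on record for an IP,
# to compare with what myip.wtf / myip.com report.
# 203.0.113.1 is a documentation placeholder, not a real Eweka address.
import socket

def whois(query: str, server: str = "whois.ripe.net", port: int = 43) -> str:
    """Send a raw whois query and return the plain-text answer."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall((query + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

if __name__ == "__main__":
    answer = whois("203.0.113.1")
    # Print the lines that usually name the holder of the range.
    for line in answer.splitlines():
        if line.lower().startswith(("netname", "descr", "org", "origin")):
            print(line)
```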