Self-Hosted Alternatives to Popular Services


A place to share, discuss, discover, assist with, gain assistance for, and critique self-hosted alternatives to our favorite web apps, web...

26
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/BinaryCheckers on 2025-01-29 21:25:21+00:00.


Is there some sort of service or Docker container I could use to manage logins for all the apps I'm hosting? Something like Entra ID, but for self-hosted stuff. I guess I could just make an HTML page with a login prompt that links to all the subdomains, but I'd prefer not to do the work myself if possible.
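
To illustrate the kind of thing I mean, here is a minimal sketch assuming something like Authelia as the identity provider (the image name, default port 9091, and /config path are my assumptions about its defaults, so treat the details as a sketch rather than a working setup):

docker run -d \
  --name authelia \
  -p 9091:9091 \
  -v ./authelia:/config \
  authelia/authelia:latest
# expects its configuration.yml inside the mounted ./authelia directory

The reverse proxy in front of the subdomains would then delegate authentication to that container, so one login covers everything behind it.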

27
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Alex_Watch on 2025-01-29 18:50:51+00:00.

28
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Brief-Zucchini-180 on 2025-01-29 13:50:54+00:00.


Hey everyone,

Since a lot of people asked for a guide on how to build a RAG with DeepSeek-R1, here's a full guide on how to build a RAG system using Ollama, DeepSeek, LangChain, ChromaDB, and Streamlit.

Let me know if you have access to it; I want to make sure it's free for everyone and helpful for the community. Thanks for the support!
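
As a rough outline of the moving parts before you open the guide, the environment setup looks roughly like this (the model tag, package names, and script name below are assumptions, not copied from the guide):

ollama pull deepseek-r1:8b                                      # local DeepSeek-R1 distill served by Ollama (tag assumed)
pip install langchain langchain-community chromadb streamlit   # RAG pipeline, vector store, and UI
streamlit run rag_app.py                                       # launch the Streamlit front end (script name hypothetical)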

29
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Zestyclose_Car1088 on 2025-01-29 18:08:07+00:00.


And what do you spend the majority of your time on regarding server management?

30
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/merox57 on 2025-01-29 15:53:06+00:00.


Hello everyone, I’m just curious about what kind of static project websites you host. I’m referring to things like Hugo, IT tools, and similar platforms.

I’m particularly interested in platforms that are easy to set up and can be safely exposed to the internet, whether self-hosted or using services like GitHub Pages.

31
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/UnitedRecognition505 on 2025-01-29 15:09:51+00:00.


So I want to buy a domain for just a personal project to share with my team; for all I care it could be named "sadsadfdsgds.in or .com" or whatever. Is there any place that provides these domains cheaply? I went on GoDaddy and asked for a domain for 10 years, and the price was high considering the name I put out there. Is there any provider who offers these at cheaper prices, less than $25?

EDIT: Not looking for .com specifically; anything would work.

32
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Anarch33 on 2025-01-29 06:50:22+00:00.


Here's the link:

I usually use Shlink, which is great, but I found it a little too heavy, especially for my VPSes that often only have 256MB of RAM. All I want is the ability to redirect links and see how many clicks they get over time, so here it is.

With the shortened links, you can add ?source=anyTextHere at the end of it to have the source show up as anyTextHere in the stats section.

If you want, you can bypass the frontend and use the backend routes yourself as an API too. API.md has info on it
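
For example, with a hypothetical short link at https://s.example.com/abc, tagging where a click came from is just the query parameter:

curl -L "https://s.example.com/abc?source=newsletter"
# the click is then counted under "newsletter" in the stats section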

33
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/AhmedBarayez on 2025-01-28 22:09:19+00:00.

34
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/yoracale on 2025-01-28 18:51:02+00:00.


I've recently seen some misconceptions that you can't run DeepSeek-R1 locally on your own device. Last weekend we worked on making it possible for you to run the actual R1 (non-distilled) model with just an RTX 4090 (24GB VRAM), which gives at least 2-3 tokens/second.

Over the weekend, we at Unsloth (currently a team of just 2 brothers) studied R1's architecture, then selectively quantized layers to 1.58-bit, 2-bit, etc., which vastly outperforms naive uniform quantization with minimal compute.

  1. We shrank R1, the 671B parameter model, from 720GB to just 131GB (an 80% size reduction) while keeping it fully functional and great
  2. No, the dynamic GGUFs do not work directly with Ollama, but they do work with llama.cpp, which supports sharded GGUFs and disk mmap offloading. For Ollama, you will need to merge the GGUFs manually using llama.cpp (see the sketch after this list)
  3. Minimum requirements: a CPU with 20GB of RAM (but it will be slow) and 140GB of disk space (to download the model weights)
  4. Optimal requirements: the sum of your VRAM + RAM should be 80GB+ (this will be pretty fast)
  5. No, you do not need hundreds of GB of RAM + VRAM, but if you have it, you can get 140 tokens per second with 2x H100s, which is even faster than DeepSeek's own API service
  6. Our open-source GitHub repo: github.com/unslothai/unsloth
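
A hedged sketch of the merge step from point 2, using llama.cpp's gguf-split tool; the shard filename below is an assumption about how the sharded upload is named, so substitute the actual first shard from the Hugging Face repo:

# Merge the sharded GGUF into a single file with llama.cpp's gguf-split tool
./llama-gguf-split --merge DeepSeek-R1-UD-IQ1_S-00001-of-00003.gguf DeepSeek-R1-merged.gguf
# Then point Ollama at the merged file via a Modelfile containing: FROM ./DeepSeek-R1-merged.gguf
ollama create deepseek-r1-dynamic -f Modelfile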

Many people have tried running the dynamic GGUFs on their potato devices and it works very well (including mine).

R1 GGUFs uploaded to Hugging Face: huggingface.co/unsloth/DeepSeek-R1-GGUF

To run your own R1 locally we have instructions + details: unsloth.ai/blog/deepseekr1-dynamic

35
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Alex_Medvedev_ on 2025-01-28 18:48:48+00:00.


Hello everyone, I want to introduce Pumpkin, a new Minecraft server software built completely from scratch in the Rust programming language. It's stupidly fast and efficient, and it also supports plugins.

While it's not done yet, there are already many things implemented, including chunk generation that gives 1:1 the same results as the original game (biomes and structures are still missing). We have almost reverse engineered the entire game logic and rewritten it in a more performant way in Rust. It was a hell of an effort and it's not done yet.

Regarding future game updates, we have an extractor (a mod) that produces JSON files from the latest game data (blocks, items, packets...). We then dynamically generate Rust code from these files. There is a big effort to implement everything in such a way that upgrading is smooth and easy.

I would love to hear some feedback :D

GitHub:

Discord:

36
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/redwoodstory on 2025-01-28 16:07:09+00:00.


LitKeeper

GitHub Repo

This is a simple web app to save stories from Literotica to ePub. In my own workflow, I download stories to a home server running Calibre-Web-Automated, which renders the stories available to other devices through its OPDS functionality. From there, I can read on my tablet, phone, e-reader, etc.

This app includes the following features:

  • Renders a simple web page prompting the user for a Literotica URL to download
  • Retrieves story, converts to ePub, and saves to a predefined location (defined in Docker Compose file)
  • Bundles story category and tags into metadata
  • Generates a cover image showing the story title and author name
  • Identifies if the story is part of a series and bundles subsequent stories into a single ePub
  • Provides an API to download stories directly from iOS shortcuts (useful for my own workflow to browse Literotica, find a story I like, and quickly save to my library via the Share menu)
  • (Optional) Sends Telegram notifications when the story is downloaded
  • (Optional) Provides somewhat extensive logging (helpful for debugging but can be disabled in Docker Compose file)

This is my first time publishing a Docker image, so I look forward to any feedback! I hope this app is useful for others who enjoy the stories on Literotica.

services:
  litkeeper:
    image: ghcr.io/redwoodstory/litkeeper:latest
    restart: unless-stopped
    ports:
      - "5000:5000"
    volumes:
      - ./epubs:/litkeeper/app/data/epubs
      - ./logs:/litkeeper/app/data/logs
    environment:
      # Optional logging controls
      - ENABLE_ACTION_LOG=true    # Set to false to disable action logging
      - ENABLE_ERROR_LOG=true     # Set to false to disable error logging
      - ENABLE_URL_LOG=true       # Set to false to disable URL logging

      # Optional Telegram notification configuration
      - TELEGRAM_BOT_TOKEN=      # Your bot token from @BotFather
      - TELEGRAM_CHAT_ID=        # Your chat ID (can be channel, group, or user ID)
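
With that compose file saved, starting it and reaching the page is the usual routine (port 5000 comes from the mapping above):

docker compose up -d
# then open http://localhost:5000 and paste a Literotica story URL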

37
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/ElephantOrnery on 2025-01-28 15:09:00+00:00.


Hi All,

I have around 90 Docker containers running on 5 servers and 3 NASes. I have a complex network setup with multiple locations, each having its own nginx server that is reachable over Tailscale, plus one public-facing one. I'm starting to get confused about how certain network flows go, especially HTTP flows via Tailscale, since sometimes they go via a sidecar Tailscale container and sometimes via a Tailscale container + nginx. Also, with this number of containers I forget now and then where a certain service is running.

Hence, I thought of making a network diagram. However, since things are quite dynamic, I was thinking of making the diagram update automatically. My idea is to use a bash script that makes SSH calls to the different servers to fetch the needed info and then use the Python Diagrams package to generate the network diagram.

Especially gathering all info and translating it to the Diagrams format seems challenging.
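
My current starting point for the gathering step is a minimal sketch like this, assuming SSH access to each host and that everything of interest runs in Docker (host names are placeholders):

#!/usr/bin/env bash
# Collect a container inventory per host; the resulting TSV feeds the Diagrams script later
for host in server1 server2 nas1; do
  ssh "$host" "docker ps --format '{{.Names}}\t{{.Image}}\t{{.Ports}}'" | sed "s/^/$host\t/"
done > inventory.tsv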

This will take quite some work to set up, but it also feels like something more people in this subreddit would think of. So I was hoping to stand on the shoulders of other sado-masochistic selfhosters :-)

Any tips, advice, code to share, to help me along?

Much appreciated!

38
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Aggravating-Vehicle9 on 2025-01-28 10:12:43+00:00.


I'm a documentary filmmaker. I make videos about conspiracy theorists and related far right-wing organisations. My films make extensive use of media found on social media and video-sharing sites.

This is not just YouTube but also other unsavoury platforms like Rumble and BitChute. I track a lot of far-right, extremist and pseudo-legal groups by downloading their videos and then indexing them for future analysis. All my videos are stored on a NAS (ASUSTOR Flashstor).

At the moment, I use some desktop software called 4K Video Downloader+. It does a good job, but it runs on a desktop, so it has some major drawbacks, the most obvious being that it will not work if my laptop is not on and logged in.

Is there a fully server-hostable user interface for yt-dlp that allows me to subscribe to channels (e.g. on YT, BitChute, Rumble, TikTok), and just have the application download the files as soon as they arrive? I would like to save each subscription to a unique directory on the host.

Ideally, I'd like to be able to run this as a self-hosted, dockerized application directly on my NAS. It should run unattended, and I should be able to upgrade it just by doing a docker pull. Is there anything like what I'm after?
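
To be clear about the behaviour I'm after: it is essentially what yt-dlp already does on the command line, run per subscription on a schedule, just with a web UI on top (the channel URL and paths below are placeholders):

yt-dlp \
  --download-archive /data/archive/channel-name.txt \
  -o "/data/subscriptions/channel-name/%(upload_date)s - %(title)s.%(ext)s" \
  "https://www.youtube.com/@channel-name/videos"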

39
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/abite on 2025-01-28 14:48:17+00:00.


Welcome to DumbWare!

Boy, do I have some news for you: we've been doing some cooking! After the amazing response to DumbDrop, DumbPad, DumbKan, and DumbDo, we decided to start our official Dumb software organization, DumbWare.io, where we will concentrate on developing stupid simple apps "that just work".

Along with this comes a fancy new website AND Discord where the community can come together to help each other out with your dumb problems and pitch new and amazing dumb ideas!

If you're pulling docker from abite3/dumbname please update it to dumbwareio/dumbname to continue receiving updates.

Now for the fun part, DumbBudget!

Stupid simple tracking for your stupid decisions

The stupid simple budget tracker, designed to help you keep track of all of your poor decisions. No muss, no fuss, just minus and plus.

Features:

  • Add/remove transactions
  • Filter by expense/income & date
  • Sort by date or $ amount
  • No database! Storage is stupid simple
  • Light/Dark mode
  • Export as CSV
  • Variable length pin authentication

And that's it!

We aren't trying to reinvent the wheel here, as a matter of fact, we're making it square.

Available on Dockerhub

We look forward to producing more simple apps that "just work".

Stay tuned as we have some exciting things coming and don't forget to join us on Discord!

40
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Shoddy-Addendum1069 on 2025-01-28 08:39:58+00:00.


Just giving a shoutout to Zoraxy. I've been wanting a great all-in-one replacement to handle reverse proxying (moving away from Cloudflare) that has local auth (as well as Authelia integration).

I've also been able to whitelist public IPs for access (so certain containers are only accessible from within my home network) with very little config.

Nice clean GUI with plenty of direction, as well as JSON config files you can edit by hand if you want to customise even more.

If you haven't tried it, it's well worth a go -

No templates in Unraid as yet, but great documentation and it's easily deployed via Portainer.

You can tell it's still in its infancy, with some features not quite there yet, but overall it's a serious competitor to NPM or Cloudflare.

41
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/nocsi on 2025-01-28 08:21:27+00:00.


Just curious, how many of you write your own dockerfiles/know how to do so vs. just pulling down someone else's willy-nilly? My workflow is:

  • Git submodule of project alongside configuration files in a child dataset
  • Dockerfile based on the project's own Dockerfile referencing that repo, or a custom one of mine that builds it, tailored for passing in environment vars and permissions
  • Docker Compose file with a build step referencing the Dockerfile for that service
  • Keep my containers linked against my own registry
  • Update submodule as needed

If you can compile an open source project, you can write your own Dockerfile. Honestly, many of you should be doing this if you want to be able to load drivers like Intel QAT or other accelerators. I get the sense that people on here are perfectly fine just pulling down whatever, but maybe a side question -- how many of you compile the projects you use?
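
For the curious, my workflow above boils down to something like this (the repo path, registry host, and Dockerfile name are placeholders):

git submodule update --remote vendor/someproject                          # pull the latest upstream source
docker build -f docker/Dockerfile.someproject \
  -t registry.lan/someproject:latest vendor/someproject                   # build my tailored image
docker push registry.lan/someproject:latest                               # publish to my own registry for compose to pull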

42
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/Psychological_Ear393 on 2025-01-28 07:17:26+00:00.

43
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/xconspirisist on 2025-01-27 22:37:50+00:00.


Just curious; what have you not found a self hosted solution for? What app do you wish you had that doesn't seem to exist?

I ask because I'm very eager to develop something new 😜 I'm working on lots of things, but many of them don't have users yet. After I created OliveTin (which has lots and lots of users), it's hard to stay motivated on new projects if you're not sure whether they're going to be used.

44
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/thehelpfulidiot on 2025-01-27 22:30:45+00:00.


Ghostboard 2.0: Dynamically Create Unlimited Real-Time Text Boards! (Self-Hosted, Open Source, Dockerized)

Hey everyone,

I’m thrilled to share the latest update to Ghostboard, my open-source project for real-time synchronized text sharing! Since the initial release, I’ve added a game-changing new feature:

🎉 Dynamic Board Creation

You can now create unlimited boards simply by appending a subdirectory to the URL. For example:

No extra configuration required—Ghostboard handles it all dynamically. Perfect for sharing different notes or projects with multiple groups!

https://yourdomain.com/ of course still works as before. This added feature did not change any previous functionality.

Key Features

Web-Based Real-Time Syncing

Access a synchronized text field that updates instantly across all connected devices.

Command-Line Client

Interact with the text programmatically using the CLI—perfect for automation or quick edits.

Dockerized for Easy Deployment

Run the server and client in seconds using prebuilt Docker images:

Dynamic Board Creation

Simply append any subdirectory (e.g., /myboard) to the URL to create a new board instantly!

Reverse Proxy Support

Works seamlessly with reverse proxies like Nginx to enable HTTPS and proper routing.

Lightweight & Private

No databases, no heavy dependencies—just WebSockets, a static webpage, and privacy you control.

How to Try It

Run the Server

docker run --rm -p 8080:8080 -p 8765:8765 thehelpfulidiot/ghostboard-server

Then visit http://<server-ip>:8080 in your browser.

Use the Client

Retrieve text:

docker run --rm thehelpfulidiot/ghostboard-client <board-url>

Update text:

docker run --rm thehelpfulidiot/ghostboard-client <board-url> "Your text here"

Repo and Links

🔗 GitHub: Ghostboard Repository

🐳 DockerHub Server: Ghostboard Server Image

🐳 DockerHub Client: Ghostboard Client Image

Why I Built It

I often needed a simple way to share text between devices—clipboard syncing, quick notes, brainstorming—without relying on external services. Ghostboard was born to provide speed, privacy, and ease of use for self-hosters.

What Do You Think?

I’d love to hear your feedback! How would you use this? What features would make it even better? I want to keep it simple but am open to adding more if possible.

Let me know your thoughts, and thanks for checking out Ghostboard! 😊

45
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/WesternPerspective53 on 2025-01-27 22:10:26+00:00.


Hey fellows,

around a month ago I posted about a tool I built : Compose Craft

It helps you turn any docker-compose file into an interactive diagram!

It's fully self-hostable, you can share your docker-compose files, edit them, and there is also a docker-compose file library.

I just released 1.0.0 and I'm wondering what features I should add to the roadmap.

Here is the cloud version link (which is free to use):

Here is the GitHub:

I hope it's going to be useful for you guys!

46
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/ShinyAnkleBalls on 2025-01-27 22:08:37+00:00.


Hi there, I keep seeing more and more posts about running Deepseek R1 locally. Some claim you can do it using Ollama with a few GB of RAM.

You can't run THE Deepseek R1 with Ollama. If you install Ollama and select Deepseek R1, what you are getting and using are the much, much smaller and much, much less performant distilled models. They are effectively fine-tunes of different existing models (Qwen2.5, Llama, etc.) using data generated by Deepseek R1. They are great, but not THE R1 OpenAI is scared of.

I don't know why Ollama decided to call these models Deepseek R1, but it's problematic. Running the actual Deepseek R1 in q4 requires more than 400GB of VRAM or RAM, depending on how long you are willing to sit there waiting for an answer...
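
To make the distinction concrete (tag names assumed from the Ollama library):

ollama run deepseek-r1:14b   # this pulls a 14B Qwen-based distill, not the 671B MoE model
# the actual 671B R1 in q4 is the one that needs 400GB+ of combined RAM/VRAM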

47
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/xXSorakaXx on 2025-01-27 21:23:21+00:00.


Hi Reddit,

Just wanted to share how I started my selfhost/homelab journey. Hopefully this post serves as inspiration for your own homelab, or maybe you even have suggestions for how I can improve mine. I'll also include some of the lessons I learned, so hopefully you won't have to make the same mistakes. I work in a hospital as a cloud engineer, so I have quite a bit of experience with IT and Linux systems, which really helped me get everything up and running.

I started a few years ago with a Synology NAS (DS220+) with two 4TB HDDs in a RAID 1 configuration and upgraded RAM (+16 GB). I could install Docker Compose on it and start playing around with containers. In a short amount of time I gathered quite a long list of containers (~35). Unfortunately, I noticed quite quickly that it was slow, mainly due to disk speed. I didn't want to replace my disks already since they were brand new, but my NAS did come with 2 USB ports, so I bought a fast USB stick for the databases. It's not the most reliable type of storage, so I made sure to at least take regular backups. I must say that it worked quite well; there were some cases where the stick entered read-only mode, which were easily fixed by re-mounting it. It ain't stupid if it works, right?

Quite recently, friends also started getting interested in my setup, since streaming services kept increasing their prices and cracking down on account sharing. I also started running into the limitations of my setup: I mainly didn't have enough storage. My NAS only has two bays, so upgrading my storage would mean replacing disks that were still working fine. I also wanted more performance, so I started looking into building my own home server.

For this I watched a bunch of recommendation videos from tech YouTubers. I came to the following build, which cost around 1,100 EUR in total:

  • Fractal Design Node 304 Mini ITX Tower Case
  • Gigabyte B550I AORUS PRO AX Mini ITX AM4 Motherboard
  • AMD Ryzen 5 5600G
  • Kingston NV3 1 TB M.2 NVME SSD (For OS, databases and application data)
  • 2x Crucial CT16G4DFRA32A 16GB DDR4 RAM
  • 2x Seagate Ironwolf Pro 16 TB (Factory refurbished)
  • GIGABYTE UD750GM 750W PSU
  • Intel Arc A380 ELF 6GB

I was happy with the refurbished deal I found for the two HDDs, which cost me only 178 EUR instead of the usual ~320 EUR. I did look at the SMART values and runtime metrics; these were still very low, so it looks like I got new drives, or they were able to reset this at the factory?

You might ask: why did you get a CPU with integrated graphics AND a dedicated graphics card? Well, I started reading up on how to do hardware transcoding, and it turns out that AMD doesn't work well (it is highly discouraged by the Jellyfin docs), so I decided to get a cheap Intel card. I've heard it's quite energy efficient, and it supports AV1, which might be nice in the future. In hindsight it might also have been better to get a PSU with a lower capacity, since those are a lot more efficient.

It took me about a day to assemble the server. I installed Ubuntu Server 24.04 LTS and started installing all the packages I needed. For the HDD filesystem I used MergerFS to stitch the HDDs together into one large volume. I mostly use this for storing large media files, so I don't really care about redundancy; it sucks if one of the drives dies, but then I'll just have to download the media again, which will take a few days.
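
For anyone curious, the MergerFS pooling boils down to a single mount along these lines (the mount points and create policy here are simplified placeholders, not my exact line):

# Pool both data disks into one volume; new files land on the branch with the most free space (mfs)
mergerfs -o defaults,allow_other,category.create=mfs /mnt/disk1:/mnt/disk2 /mnt/storage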

Recently I also implemented a backup strategy for my important data (appdata/databases), since I'm not using RAID to store data across multiple disks. For this I'm using the docker-volume-backup container. Every week it runs the backup process: it stops a bunch of containers during the backup, makes a copy and compresses the data. After that it stores the archive in a specified local location and uploads a copy to my NAS over SMB.

Some useful services that I’m selfhosting and would like to highlight:

  • Nginx Proxy Manager: Makes it very easy to run a reverse proxy, also handles all the SSL certificates for you
  • Jellyfin: Media streaming system, open source and free.
  • Spotweb: A free decentralized usenet indexer
  • Tdarr: Especially nice if you don't have a beefy server to transcode media while streaming. I used this on my Synology to get all my media into the H.264 format, which is very well supported, so no transcoding would be necessary. It can offload jobs to nodes (my gaming PC with an RTX 2070 GPU) for transcoding. Currently I'm using a flow that transcodes all media into H.265 (better compression) using both my Intel GPU and my Nvidia GPU, based on the One Flow to Rule Them All by u/SamSausages.
  • AdGuardHome: Self hosted DNS server, which blocks ads and trackers. The amount of traffic that is blocked is insane!
  • Dockergc / watchtower: Deleting dangling containers and automatically installing updated containers
  • Ddclient: I have a dynamic IP, so this will automatically update my DNS records
  • Crowdsec: Crowdsourced security; it bans abusive IPs that have been attacking other people's servers from reaching my server too
  • Redlib: Private Reddit frontend which I now use almost exclusively. In the past I was an Apollo user. I didn’t like the Reddit app with all of its tracking so switched over to Redlib after they killed third party apps. Only disadvantage is that you cannot post using Redlib.
  • Firefly: Great for tracking how your finances are doing. I'm also using the auto-import function, which can pull my banking data through GoCardless. I have a bunch of classification rules set up which give me great insights into how I'm spending my money, without much effort.
  • Karaoke Eternal: This is a fun project that is quite unknown, you can use it to host a karaoke server. Had tons of fun with friends with this service.

So far, I'm very pleased with the performance. It now takes less than 10 seconds to start my Docker Compose stack; on my Synology this would take more than 5 minutes. It can also handle multiple media streaming sessions with ease and doesn't break the bank in terms of power consumption. I didn't notice a significant increase in my monthly power bill, though I unfortunately don't have the tools to measure how much power it actually consumes.

Work in progress... I was still missing the drives and the GPU. Getting all the cables in was quite difficult with such a small case; getting it tidy was impossible.

Finally done...

48
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/muhepd on 2025-01-27 17:42:40+00:00.


Hello,

Just like the title says: do you know of a self-hosted tool that can monitor a YouTube playlist/channel and download the latest video uploaded to it?
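
In case it helps to clarify, the behaviour I'm after is roughly this, run on a schedule (yt-dlp flags as I understand them; the playlist URL is a placeholder):

# crontab entry: check hourly; the archive file prevents re-downloading videos already fetched
0 * * * * yt-dlp --download-archive /data/seen.txt -P /data "https://www.youtube.com/playlist?list=PLACEHOLDER"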

Thanks in advance.

49
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/LordLuz on 2025-01-27 13:34:15+00:00.


I work in a company with about 5,000 employees. I want to create a library or an information-sharing center that can primarily store eBooks, journals, and standards, but can also store videos, photos, etc.

I have some requirements for the library, such as:

  1. It must have some form of user permission system to lock certain users out of specific libraries, or vice versa, locking certain libraries to specific user groups.
  2. Every user must be able to contribute to the library.
  3. It must be able to perform OCR (Optical Character Recognition).
  4. Optional: It should be able to store videos and other media types.

In my own research, I couldn't find any platform that can do all of this reliably. The only option I found that somewhat matches my vision is Calibre / Calibre-Web. Is there any platform or workflow you can suggest? I would appreciate any help.

50
 
 
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/selfhosted by /u/esiy0676 on 2025-01-27 11:12:16+00:00.


I found a 6-year-old post here on the same topic, but lots of the options mentioned seem unmaintained by now, e.g.:

Does anyone have any ideas for a self-hosted pastebin-like solution which allows:

  • private only posts; as well as
  • public access; and
  • full-text search; and
  • NOT using PHP in the stack (yes, hard requirement)?

EDIT: Apparently it's not possible to answer openly in the comments why PHP is a no-go for some. Of course you can mention any solution for the benefit of others; it's just my requirement, and downvotes alone will not really change it. Thanks for all the mentions, everyone!
