this post was submitted on 23 Jan 2021
53 points (92.1% liked)

Privacy


Serious question: why do we give so much power to tech firms such as Google, when the real solution is decentralization? We have some solutions for money with Bitcoin, but there still doesn't seem to be a viable decentralized alternative for a search engine.

[–] [email protected] 6 points 4 years ago

pros (of YaCy):

  • powerful customization. intuitive and fine-grained.
  • decentralized. hugely powerful, but not perfected.
  • the crawler is simple and customizable. fairly raw.
  • Solr/Lucene benefits included from day 1.
  • a website that covers the basics, and a forum for help.

cons:

  • frequently buggy, and feels heavy.
  • very little documentation. many good features are difficult to discover or use.
  • consumes a lot of storage, and the index is not divisible.
  • setup takes time.
  • no foolproof simplicity, and less adoption due to setup time and difficulty.
  • not recognized as important and not very popular, which is a self-fulfilling prophecy.
  • few maintainers/contributors, so development is slow.

alternatives:
none that are decentralized.
there are many alternatives for more efficient, small personalized search engines, with medium startup time and skill level.
a crawler can be more customizable if you script it yourself (for YaCy or anything else).
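To make the "script it yourself" point concrete, here is a minimal breadth-first crawler sketch in Python, standard library only. It is not YaCy code; every name here is illustrative. The fetcher is passed in as a callable, so you can plug in any HTTP client (or a stub for testing) and feed the visited pages to whatever index you like.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in an HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl starting at start_url.

    `fetch(url)` must return the page's HTML as a string, or None on
    failure. Returns the list of URLs successfully visited, in order.
    """
    seen = {start_url}
    order = []
    queue = deque([start_url])
    while queue and len(order) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        order.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return order
```

Because `fetch` is injected, the same loop works against a dict of canned pages for testing, or against a real HTTP client in production.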

[–] [email protected] 2 points 4 years ago

To run YaCy with a lighter footprint (e.g. on a low-resource device):

  • sh startYACY.sh -d

Controlling CPU usage:

On the /PerformanceQueues_p.html page, you can limit the maximum number of some threads (lightweight processes inside the JVM) in the Thread Pool Settings table. While crawling, you can also control the speed in PPM (pages per minute) on the /Crawler_p.html page.
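To make the PPM number concrete, here is a tiny client-side illustration (not YaCy's implementation): converting a pages-per-minute cap into the delay a crawler would sleep between fetches.

```python
def delay_for_ppm(ppm):
    """Seconds to sleep between page fetches to stay under `ppm` pages per minute."""
    if ppm <= 0:
        raise ValueError("ppm must be positive")
    return 60.0 / ppm


# e.g. a crawl capped at 120 PPM leaves 0.5 s between fetches,
# while 60 PPM means one page per second.
```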