llama

joined 2 months ago
[–] [email protected] 4 points 1 month ago (1 children)

I don't know that one. Is it FOSS?

[–] [email protected] 5 points 1 month ago* (last edited 1 month ago)

You are completely right. That was worded poorly and a few users have thankfully pointed that out. The answer, for most people, is yes. But that depends entirely on your threat model.

The traffic to your Snowflake proxy isn't necessarily from people in 'adversarial countries'. A Snowflake proxy is a type of bridge, so just about anyone can use it. You can use a Snowflake bridge yourself if you want. However, it is strongly encouraged to leave bridges (including Snowflake) to the people who actually need them.

So, for most people, it is generally safe to run Snowflake proxies. Theoretically, your ISP will be able to tell that connections are being made, but to them it will look like you're calling someone on, say, Zoom, since Snowflake uses WebRTC. They can't see the data, though, since everything is encrypted (check the Snowflake docs and the Tor Browser docs for further reference). You probably won't get in any trouble for that.

Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting attention from law enforcement, but none of them have been arrested or prosecuted so far.

If you know of any, let me know.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago)

I have not used AI to write the post. I used Claude to refine it because English is not my first language. If there are any errors, that is my bad. Please point them out as you did so I can fix them.

This has several errors including the fact that running the proxy exposes your IP address.

Thank you for pointing that out. That was worded pretty badly. I corrected it in the post.

For further clarification:

The person who is connecting to your Snowflake bridge connects to it over a P2P-like connection. So that person does know your IP address, and your ISP also knows the person's IP address – the one connecting to your bridge.

However, to both of your ISPs it will look like the two of you are using some kind of video conferencing software, such as Zoom, because Snowflake uses WebRTC. This makes the traffic inconspicuous and obscures from both ISPs what's actually going on.

To most people, that is not something of concern. But, ultimately, it comes down to your threat model. Historically, there haven't been any cases of people running bridges or entry and middle relays and getting in trouble with law enforcement.

So, will you get in any trouble for running a Snowflake bridge? The answer is quite probably no.

For clarification, you're not acting as an exit node if you're running a Snowflake proxy. Please check Tor's documentation and Snowflake's documentation.

[–] [email protected] 3 points 1 month ago (6 children)

Not true. If you try to load a model that is beyond your phone's hardware capabilities, it simply won't open. Stop spreading FUD.

[–] [email protected] 1 points 1 month ago

Though apparently I didn't need step 6 as it started running after I downloaded it

Hahahha. It really is a little redundant, now that you mention it. I'll remove it from the post. Thank you!

Good fun. Got me interested in running local LLM for the first time.

I'm very happy to hear my post motivated you to run an LLM locally for the first time! Did you manage to run any other models? How was your experience? Let us know!

What type of performance increase should I expect when I spin this up on my 3070 ti?

That really depends on the model, to be completely honest. Make sure to check the model requirements. For llama3.2:1b, at least, you can expect a significant performance increase.
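If you want to confirm that Ollama is actually using the GPU, a couple of quick checks usually do the trick (assuming a reasonably recent Ollama build and working NVIDIA drivers; the exact output varies):

    # check how the loaded model is scheduled (CPU vs GPU)
    ollama ps
    # watch GPU memory and utilization from the driver side
    nvidia-smi

If the model shows up as running on the GPU, you're getting the accelerated path.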

[–] [email protected] 3 points 1 month ago

Of course! I run several snowflake proxies across my devices and their browsers.

[–] [email protected] 25 points 1 month ago (1 children)

I didn't use an LLM to make the post. I did, however, use Claude to make it clearer since English is not my first language. I hope that answers your question.

[–] [email protected] 2 points 1 month ago

I have tried it on roughly five spare phones. None of them have less than 4 GB of RAM, however.

 

cross-posted from: https://lemmy.dbzer0.com/post/36880616

Help Combat Internet Censorship by Running a Snowflake Proxy (Browser or Android)

Internet censorship remains a critical threat to free expression and access to information worldwide. In regions like Iran, Russia, and Belarus, journalists, activists, and ordinary citizens face severe restrictions when trying to communicate or access uncensored news. You can support their efforts by operating a Snowflake proxy—a simple, low-impact way to contribute to a freer internet. No technical expertise is required. Here’s how it works:


What Is Snowflake?

Snowflake is a privacy tool integrated with the Tor network. By running a Snowflake proxy, you temporarily route internet traffic for users in censored regions, allowing them to bypass government or institutional blocks. Unlike traditional Tor relays, Snowflake requires minimal bandwidth, no configuration, and no ongoing maintenance. Your device acts as a temporary bridge, not a permanent node, ensuring both safety and ease of use.


Is This Safe for Me?

Short answer: Yes.

Long answer: probably. Here is why:

  • Your IP address is not exposed to the websites users access through your proxy, so you don't have to worry about what they are doing either. You are not an exit node.
  • No activity logs. Snowflake cannot monitor or record what users do through your connection. The only stored information is how many people have connected to your bridge. Check docs for further info on this.
  • Low resource usage. The data consumed is comparable to background app activity—far less than streaming video or music.
  • No direct access to your system.
  • No storage of sensitive data. Snowflake proxies do not store any sensitive data, such as IP addresses or browsing history, on your system.
  • Encrypted communication. All communication between the Snowflake proxy and the Tor network is encrypted, making it difficult for attackers to intercept or manipulate data.

You are not hosting a VPN or a full Tor relay. Your role is limited to facilitating encrypted connections, similar to relaying a sealed envelope.

Your IP address is exposed to the user you are helping (in a P2P-like connection). Your ISP could also potentially see the WebRTC traffic and the connections being made to your device (but not their contents), so keep your threat model in mind.

For most users, it is generally safe to run Snowflake proxies. Your ISP will be able to tell that connections are being made, but to them it will look like you're calling someone on, say, Zoom.

Historically, as far as we know, there haven't been any cases of people getting in legal trouble for running entry relays, middle relays, or bridges. There have been a few cases of people running exit nodes and getting attention from law enforcement agencies, but as far as I know none of them have been arrested or prosecuted. If you are aware of any cases, let me know so I can update this post.

Do not hesitate to check Snowflake's official documentation for further reference and to make informed decisions.


How to Set Up a Snowflake Proxy

Option 1: Browser Extension (Brave, Firefox, or Chrome)

  1. Install the Snowflake extension.
  2. Click the Snowflake icon in your browser toolbar and toggle "Enable Snowflake."
  3. Keep the browser open. That’s all.

Note: Brave users can enable Snowflake directly in settings. Navigate to brave://settings/privacy and activate the option under "Privacy and security."


Option 2: Android Devices via Orbot

  1. Download Orbot (Tor’s official Android app).
  2. Open the app’s menu, select "Snowflake Proxy," and toggle it on.
  3. For continuous operation, keep your device charged and connected to Wi-Fi.

Your device will now contribute as a proxy whenever the app is active.
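If you'd rather contribute from an always-on computer instead of a browser or phone, the Tor Project also maintains a standalone proxy. The sketch below is only a rough outline (it assumes Go is installed; build steps and options can change, so check the repository's README before relying on it):

    git clone https://gitlab.torproject.org/tpo/anti-censorship/pluggable-transports/snowflake.git
    cd snowflake/proxy
    go build .
    ./proxy

Left running on a spare machine, it contributes in the same way the browser extension does, just without needing a browser window open.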


Addressing Common Concerns

  • Battery drain: Negligible. Snowflake consumes fewer resources than typical social media or messaging apps.
  • Data usage: Most users report under 1 GB per month. Adjust data limits in Orbot’s settings or restrict operation to Wi-Fi if necessary.

Why Your Participation Matters

Censorship mechanisms grow more sophisticated every year, but tools like Snowflake empower ordinary users to counteract them. Each proxy strengthens the Tor network’s resilience, making it harder for authoritarian regimes to isolate their populations. By donating a small amount of bandwidth, you provide someone with a critical connection to uncensored information, education, and global dialogue.

Recent surges in demand—particularly in Russia—highlight the urgent need for more proxies. Your contribution, however small, has an impact.

By participating, you become part of a global effort to defend digital rights and counter censorship. Please also be mindful of your threat model and understand the potential risks (though they are very small for most people). Check Snowflake's official documentation for further reference and don't make any decisions based on this post before taking your time to read through it.

Please share this post to raise awareness. The more proxies, the stronger the network.

– llama

[–] [email protected] 2 points 1 month ago

Great explanation, Max!

[–] [email protected] 2 points 1 month ago (3 children)

I would argue there would not be any noticeable differences.

34
submitted 1 month ago* (last edited 1 month ago) by [email protected] to c/[email protected]
 

Hello, everyone! I wanted to share my experience of successfully running LLaMA on an Android device. The model that performed the best for me was llama3.2:1b on a mid-range phone with around 8 GB of RAM. I was also able to get it up and running on a lower-end phone with 4 GB of RAM. However, I also tested several other models that worked quite well, including qwen2.5:0.5b, qwen2.5:1.5b, qwen2.5:3b, smallthinker, tinyllama, deepseek-r1:1.5b, and gemma2:2b. I hope this helps anyone looking to experiment with these models on mobile devices!


Step 1: Install Termux

  1. Download and install Termux from the Google Play Store or F-Droid

Step 2: Set Up proot-distro and Install Debian

  1. Open Termux and update the package list:

    pkg update && pkg upgrade
    
  2. Install proot-distro:

    pkg install proot-distro
    
  3. Install Debian using proot-distro:

    proot-distro install debian
    
  4. Log in to the Debian environment:

    proot-distro login debian
    

    You will need to log in to the Debian environment every time you want to run Ollama. You will also need to repeat the steps below each time you want to run a model (excluding Step 3 and the first half of Step 4) – see the quick recap after Step 5.


Step 3: Install Dependencies

  1. Update the package list in Debian:

    apt update && apt upgrade
    
  2. Install curl:

    apt install curl
    

Step 4: Install Ollama

  1. Run the following command to download and install Ollama:

    curl -fsSL https://ollama.com/install.sh | sh
    
  2. Start the Ollama server:

    ollama serve &
    

    After you run this command, press Enter (or Ctrl + C) to get your prompt back; because of the trailing &, the server keeps running in the background.
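    If you want to confirm the server is actually up, Ollama listens on port 11434 by default (assuming you haven't changed OLLAMA_HOST), so the following should return a short status message:

    curl http://localhost:11434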


Step 5: Download and run the Llama3.2:1B Model

  1. Use the following command to download the Llama3.2:1B model:
    ollama run llama3.2:1b
    
    This step fetches and runs the lightweight 1-billion-parameter version of the Llama 3.2 model.
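Once everything above is installed, you don't have to redo the whole setup. The commands you will actually repeat each session (as noted in Step 2) boil down to:

    proot-distro login debian
    ollama serve &
    ollama run llama3.2:1b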

Running LLaMA and other similar models on Android devices is definitely achievable, even with mid-range hardware. The performance varies depending on the model size and your device's specifications, but with some experimentation, you can find a setup that works well for your needs. I’ll make sure to keep this post updated if there are any new developments or additional tips that could help improve the experience. If you have any questions or suggestions, feel free to share them below!
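As a side note, while the server is running you can also talk to it through Ollama's local HTTP API instead of the interactive prompt, which is handy if you want other apps on the phone to use the model. A minimal example (adjust the model name and prompt to whatever you're running):

    curl http://localhost:11434/api/generate -d '{
      "model": "llama3.2:1b",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'

With "stream": false the server replies with a single JSON object containing the full response instead of streaming it token by token.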

– llama

 

14
submitted 2 months ago* (last edited 2 months ago) by [email protected] to c/[email protected]
 

I created this account two days ago, but one of my posts ended up in the (metaphorical) hands of an AI powered search engine that has scraping capabilities. What do you guys think about this? How do you feel about your posts/content getting scraped off of the web and potentially being used by AI models and/or AI powered tools? Curious to hear your experiences and thoughts on this.


Prompt Update

The prompt was something like, "What do you know about the user [email protected] on Lemmy? What can you tell me about his interests?" Initially, it generated a lot of fabricated information, but it would still include one or two accurate details. When I ran the test again, the response was much more accurate compared to the first attempt. It seems that as my account became more established, it became easier for the crawlers to find relevant information.

It even talked about this very post on item 3 and on the second bullet point of the "Notable Posts" section.

For more information, check this comment.


Edit¹: This is Perplexity. Perplexity AI employs data scraping techniques to gather information from various online sources, which it then utilizes to feed its large language models (LLMs) for generating responses to user queries. The scraping process involves automated crawlers that index and extract content from websites, including articles, summaries, and other relevant data. It is an advanced conversational search engine that enhances the research experience by providing concise, sourced answers to user queries. It operates by leveraging AI language models, such as GPT-4, to analyze information from various sources on the web. (12/28/2024)

Edit²: One could argue that data scraping by services like Perplexity may raise privacy concerns because it collects and processes vast amounts of online information without explicit user consent, potentially including personal data, comments, or content that individuals may have posted without expecting it to be aggregated and/or analyzed by AI systems. One could also argue that this indiscriminate collection raises questions about data ownership, proper attribution, and the right to control how one's digital footprint is used in training AI models. (12/28/2024)

Edit³: I added the second image to the post and its description. (12/29/2024).

 

I use both Threads and Mastodon. However, I realized that sometimes (public) profiles on Threads don't show up on Mastodon and vice versa. I also realized that most comments made on Threads posts don't show up on Mastodon – that is, if the posts appear on Mastodon at all. The same is true the other way around. Why does this happen?

55
submitted 2 months ago* (last edited 2 months ago) by [email protected] to c/[email protected]
 

I've been using Lemmy since the Reddit exodus. I haven't looked back since, but I miss a lot of mental health communities that I haven't been able to find replacements for here on Lemmy. Does anyone know any cool mental health communities that are somewhat active?
