
Bruno Poellhuber
  • I’ve been running local LLMs for a while now. If you’re anything like me, you know the feeling — you download a new model, hit “load,” and watch your machine choke because it doesn’t have enough RAM. It’s frustrating, especially when you know you have a more powerful machine sitting somewhere else.

    That’s exactly the situation I was in. And then I got early access to LM Studio Link.

    • I have two Apple Silicon machines at home:

      • Mac Mini M4 — 16GB RAM. My daily driver. Compact, quiet, always on.
      • MacBook Pro M4 Max — 64GB RAM. The beast. But it’s not always at my desk.

      The Mini handles everyday tasks beautifully. But when it comes to running serious models like OpenAI’s GPT-OSS 20B, Qwen 3.5 35B, or Llama 3 70B? Not a chance. These models need far more memory than 16GB can offer.
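The memory claim above is easy to sanity-check with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bits per parameter. This is a minimal sketch (not from the original post); the function name is mine, and real usage is higher once you add the KV cache, activations, and framework overhead.

```python
# Rough memory estimate for an LLM's weights alone (a sketch; actual
# usage adds KV cache, activations, and runtime overhead on top).
def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """GiB needed just to hold the weights."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 2**30

# A 20B-parameter model at 4-bit quantization:
print(f"{weight_memory_gb(20, 4):.1f} GiB")   # ~9.3 GiB -- tight on a 16GB machine
# A 70B-parameter model at 4-bit quantization:
print(f"{weight_memory_gb(70, 4):.1f} GiB")   # ~32.6 GiB -- exceeds 16GB outright
```

Even at aggressive 4-bit quantization, a 70B model cannot fit in 16GB, which is why the 64GB machine matters.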


Erkan Saka

Hey Grok, is this true?

According to Prof. Dr. Erkan Saka, the fact that even the most advanced large language models (LLMs) can be poisoned and misled by information pollution should be seen as a structural weakness of the media. The situation points to a three-layered vulnerability: platforms are cutting back human moderation and handing oversight to AI, while LLMs, because of the pattern-matching logic of their training, can mistake information pollution for fact.

Shared by Erkan Saka, 1 save total

Bruno Poellhuber
  • We have identified industrial-scale campaigns by three AI laboratories—DeepSeek, Moonshot, and MiniMax—to illicitly extract Claude’s capabilities to improve their own models. These labs generated over 16 million exchanges with Claude through approximately 24,000 fraudulent accounts, in violation of our terms of service and regional access restrictions.
  • These labs used a technique called “distillation,” which involves training a less capable model on the outputs of a stronger one. Distillation is a widely used and legitimate training method. For example, frontier AI labs routinely distill their own models to create smaller, cheaper versions for their customers. But distillation can also be used for illicit purposes: competitors can use it to acquire powerful capabilities from other labs in a fraction of the time, and at a fraction of the cost, that it would take to develop them independently.
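The distillation technique described above is typically implemented as a loss that pushes the student's output distribution toward the teacher's. The sketch below is a generic, textbook version of that loss (it is not from the annotated article, and the temperature value and example logits are illustrative assumptions):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax; higher temperature flattens the
    # distribution so the student sees the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(p * math.log(p / q) for p, q in zip(t, s) if p > 0)

# Illustrative logits: the student roughly tracks the teacher, so the
# loss is a small positive number; training drives it toward zero.
print(distillation_loss([3.0, 1.0, 0.2], [2.5, 1.2, 0.3]))
```

When the "teacher" is a competitor's model queried through an API, the same mechanism lets a lab copy capabilities without doing the training work itself, which is the abuse described in the annotation.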

