me: (having chronic executive dysfunction since childhood, asking if 36mg Concerta is enough for me)
my psychiatrist: your screen time must be 4 hours at most, only for coding, research... and 0 hours for playing games, social media... you must cut all sugar, even fruit. and I want you to read Jules Payot's book The Education of the Will and take notes
me: O.o
hello
npub1tnda...mv80
I am new to Nostr and I am from Türkiye, so if my English is bad, it's because I am still learning.
- 21 y/o
- NixOS
- Flatpak
- GNOME
- Podman
- Mullvad
- JavaScript
- Roblox (yea... can't chat of course)
vibecoding is ok if you rigorously audit what your AI is doing. those tools are useful but they hallucinate so much, in sneaky ways. you have to sniff your AI-generated code line by line like a pervert to make sure it's safe, secure and correct
I also love the concept of property-based testing. I always hated unit tests. property-based testing is like the practical version of formal verification
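The idea can be sketched in a few lines. Instead of asserting one hand-picked input/output pair, you state a property that must hold for *every* input and hammer it with random data. This is a minimal hand-rolled sketch in Python using only the standard library (real property-based testing frameworks like Hypothesis also shrink failing inputs; the `rle_encode`/`rle_decode` functions here are just a toy example, not from any library):

```python
import random

def rle_encode(s):
    # run-length encode: "aaab" -> [("a", 3), ("b", 1)]
    out = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def rle_decode(pairs):
    # inverse of rle_encode: expand each (char, count) run back into a string
    return "".join(ch * n for ch, n in pairs)

# property: decoding an encoding gives back the original string,
# checked on many random inputs instead of a few hand-picked cases
for _ in range(1000):
    s = "".join(random.choice("ab") for _ in range(random.randrange(20)))
    assert rle_decode(rle_encode(s)) == s
```

A single unit test would only prove the round trip works for one string; the random loop probes the whole input space, which is why it feels closer to (but is still much weaker than) formal verification.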
#programming #software #vibecoding
I just created another PR for Flatpak to mount "/dev/ntsync", after months... because nobody cared. I don't know, man. NTSync is fucking cool for WINE. I guess they just didn't prioritize it.
This was the first PR:
This is the new PR:
I don't even know C. I just hope they accept it this time. bruh
#linux #flatpak #wine
GitHub: flatpak-run: Mount /dev/ntsync with --device=ntsync by kolon0 · Pull Request #6353 · flatpak/flatpak
"NTSync was already released in the Linux 6.14 kernel version. The WINE 10.16 release notes also state that NTSync support has been added. During th..."
GitHub: flatpak-run: Mount /dev/ntsync with --allow=multiarch by kolon0 · Pull Request #6542 · flatpak/flatpak
"This PR bind-mounts /dev/ntsync into the sandbox if the host kernel supports it and the --allow=multiarch permission is granted. Why is this needed..."
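The gating logic the new PR describes ("bind-mount /dev/ntsync if the host kernel supports it") boils down to checking whether the character device exists on the host. A rough illustration of that check in Python (this is just a sketch of the idea, not the actual C code in the PR; `host_supports_ntsync` is a hypothetical name):

```python
import os
import stat

def host_supports_ntsync(path="/dev/ntsync"):
    # the PR only bind-mounts the device into the sandbox when the
    # host kernel actually exposes it as a character device
    try:
        return stat.S_ISCHR(os.stat(path).st_mode)
    except OSError:
        return False
```

On kernels older than 6.14 the path simply does not exist, so the check quietly returns False and nothing is mounted.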
For years, I kept thinking about how to make machine learning models as trustworthy as expert humans, specifically for things like text/image moderation. But I always procrastinated on learning things like linear algebra, probability/statistics, calculus... I just procrastinate.
I found this course:
Now I am begging myself: please don't procrastinate, please don't procrastinate, please don't procrastinate... bruh
I have heard about things like conformal prediction and calibrating model probabilities many times. None of them was satisfying to me at all. Now my obsession is: how to detect near-out-of-distribution inputs... and I don't know an answer that satisfies me yet...
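For anyone who hasn't seen it, the split-conformal recipe mentioned above is surprisingly small. This is a minimal sketch with toy numbers (function names and the `1 - p` nonconformity score are my own choices for illustration): compute a quantile of calibration nonconformity scores, then keep every label whose score falls under it.

```python
import math

def conformal_threshold(cal_scores, alpha=0.1):
    # split conformal: the ceil((n+1)(1-alpha))-th smallest calibration
    # nonconformity score becomes the cutoff, giving ~(1-alpha) coverage
    n = len(cal_scores)
    k = math.ceil((n + 1) * (1 - alpha))
    return sorted(cal_scores)[min(k, n) - 1]

def prediction_set(probs, threshold):
    # keep every label whose nonconformity score (1 - p) is within the cutoff
    return {label for label, p in probs.items() if 1 - p <= threshold}

# toy calibration scores: 1 - probability assigned to the true class
cal = [0.05, 0.10, 0.15, 0.20, 0.30, 0.40, 0.50, 0.60, 0.70]
t = conformal_threshold(cal, alpha=0.2)
print(prediction_set({"cat": 0.7, "dog": 0.25, "fox": 0.05}, t))
```

The catch, which I think explains the dissatisfaction above: the coverage guarantee is marginal and assumes exchangeability between calibration and test data, so it says nothing reliable about near-out-of-distribution inputs, where exactly that assumption breaks.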
#ml #machine_learning

Mathematics for Machine Learning and Data Science Specialization
A beginner-friendly specialization where you'll master the fundamental mathematics toolkit of machine learning: calculus, linear algebra, statistic...
I just started using Amber instead of directly giving my private key to Amethyst... wow... it's like... wow
It even has an offline version. I love it! It's my guardian angel. I can accept or deny whatever I want for each client. I am very impressed by Nostr and its ecosystem. I am gonna start making donations. I just need to figure out that... bitcoin wallet thing? bruh. I don't even know why bitcoin though... but anyways... as long as I can make donations, it doesn't matter to me
I am gonna use whatever client I want with no trust issues. Thank you to everyone who made this possible! I am also gonna introduce Nostr to my Turkish friends
Disclaimer: I am not a cryptographer.
I really don't know why Nostr's NIP-13[1] uses the leading-zeroes approach for Proof-of-Work. Can't GPUs just chew through this kind of PoW very easily? I thought an RSA-based time-lock puzzle[2] would be the correct approach. There are also the Wesolowski and Pietrzak VDFs (Verifiable Delay Functions). But we don't need those here for relays, right? Because clients generate the PoWs and relays only need to verify them? And an RSA-based time-lock puzzle's implementation is much easier. Am I wrong?
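For context, NIP-13's difficulty is the number of leading zero bits in the event id (a sha256 hash), and mining is a brute-force nonce search. A simplified sketch of both halves in Python (a real NIP-13 miner hashes the serialized event JSON with a nonce tag; here I hash a plain string just to show the mechanics):

```python
import hashlib

def leading_zero_bits(hex_id):
    # NIP-13 difficulty: count of leading zero bits in the hex event id
    bits = 0
    for ch in hex_id:
        v = int(ch, 16)
        if v == 0:
            bits += 4
        else:
            bits += 4 - v.bit_length()
            break
    return bits

def mine(payload, difficulty):
    # brute-force a nonce until the digest has enough leading zero bits;
    # this search is embarrassingly parallel, which is why GPUs/ASICs
    # eat hash-based PoW for breakfast
    nonce = 0
    while True:
        h = hashlib.sha256(f"{payload}:{nonce}".encode()).hexdigest()
        if leading_zero_bits(h) >= difficulty:
            return nonce, h
        nonce += 1
```

Each added bit of difficulty doubles the expected work, but since every nonce can be tried independently, hardware parallelism cuts wall-clock time linearly; a time-lock puzzle or VDF forces inherently sequential work instead, which is exactly the trade-off the question above is circling.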
[1]:
[2]: https://people.csail.mit.edu/rivest/pubs/RSW96.pdf
NIP-13 - Proof of Work