Replies (53)

Details, please! Is this PWA distribution via nostr events? Or is the PWA signature on nostr? What is signed there? I'm very eager to learn for WalletScrutiny, where I had PWAs kind of dismissed as impossible to secure against hacked servers, for example.
On Google phones, the OS won't let you downgrade apps or upgrade them to a version that's signed by a different key. I want this for PWAs. Of course, as admin I want to be able to override it, but it would be great to have such a lock-in that no random hacker on a webserver could circumvent.
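A minimal sketch of how a client could emulate that lock-in for PWAs installed via nostr events; the `InstalledApp` and `ReleaseEvent` records and the `verifyUpdate` name are hypothetical, not from any published spec:

```typescript
// Hypothetical sketch: emulating Android's signer pinning and
// anti-downgrade rules for PWA updates delivered via nostr events.

interface InstalledApp {
  signerPubkey: string; // nostr pubkey pinned on first install (TOFU)
  versionCode: number;  // monotonically increasing release number
}

interface ReleaseEvent {
  pubkey: string;      // signer of the release event
  versionCode: number; // version advertised by the release
}

function verifyUpdate(installed: InstalledApp, release: ReleaseEvent): void {
  // Same rule Android enforces at the OS level: updates must come
  // from the key seen on first install.
  if (release.pubkey !== installed.signerPubkey) {
    throw new Error("update signed by a different key; refusing");
  }
  // No downgrades: a lower or equal version is rejected
  // (unless the admin explicitly overrides).
  if (release.versionCode <= installed.versionCode) {
    throw new Error("downgrade or replay detected; refusing");
  }
}
```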
DUDE 🤯 Nostr is one of those technologies where every time I see it applied to other stuff, the application is both mind-boggling and so obvious. Amazing.
This introduces interesting attack vectors that will need to be mitigated. E.g. when a person gets hacked, that could introduce many "scam" apps to the people who followed them.
Gerardo 1 year ago
I'm surprised something like this hasn't been done before, it's genius 👀
You can mitigate this with external code verifiers:
- ones that get paid to do that 👉 verifier DVMs
- ones that pay to do that 👉 early beta access

(and then of course we need key rotation asap, for way more reasons than just this)
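A rough sketch of what a paid verifier job could look like as a NIP-90 data-vending-machine request; the job kind 5900 and the bid amount are made-up placeholders, not a standardized job type:

```typescript
// Hypothetical NIP-90 job request asking a DVM to verify an artifact.
// Kind 5900 is invented for illustration; NIP-90 only reserves the
// 5000-5999 range for job request kinds.
const verificationJob = {
  kind: 5900,
  tags: [
    ["i", "<nip94-event-id>", "event"], // input: the file event to verify
    ["output", "text/plain"],           // requested result format
    ["bid", "10000"],                   // offered payment in millisats
  ],
  content: "",
};
```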
"Trust" is delegated from the App Store bureaucrats, who let scam bitcoin wallets in anyway, to your WoT. Would you expect all of your WoT to get hacked at the same time @nout?
I'm assuming that not all people in your web of trust need to verify the app, so there will be some threshold, e.g. "at least 3 people in your WoT with a score over 5 verified app ABC". And then yeah, 2 people could be hacked, especially over the longer term. There are definitely ways to mitigate these issues...
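That threshold rule could look something like this; the attestation shape, scoring scale, and defaults are illustrative assumptions, not zap.store's actual logic:

```typescript
// Sketch of the threshold rule described above: an app counts as
// verified only if enough sufficiently-trusted people attested to it.

interface Attestation {
  pubkey: string;   // the attester
  wotScore: number; // attester's trust score in the viewer's WoT
}

function isVerified(
  attestations: Attestation[],
  minAttesters = 3, // "at least 3 people..."
  minScore = 5,     // "...with a score over 5"
): boolean {
  const trusted = attestations.filter((a) => a.wotScore > minScore);
  return trusted.length >= minAttesters;
}
```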
The framework that I like to think in is "desirables vs undesirables":
- If the user action is desirable, the user should be rewarded sats.
- If the user gets value from the content, they should be nudged to voluntarily pay sats for it.
- If the action is undesirable, it should cost sats.
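As a toy sketch of that framework (the action categories and amounts are placeholders, not a real fee schedule):

```typescript
// Toy model: the sign of the sat flow follows the desirability of the action.
type Action = "desirable" | "valuable-content" | "undesirable";

// Positive = sats paid to the user, negative = sats the user pays.
function satFlow(action: Action): number {
  switch (action) {
    case "desirable":
      return +21; // reward the user
    case "valuable-content":
      return -21; // a suggested voluntary zap, not an enforced charge
    case "undesirable":
      return -210; // make the action cost sats
  }
}
```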
nobody 1 year ago
Great idea. I just about had my whole stack siphoned last weekend by a fake version of Sparrow that was listed as a mobile app.
Maybe? Are you thinking that when a new person verifies the app, the verification has to be interactively signed (with multisig) by other people? Or are you thinking of co-signing with the store, or some verification system that is paid for co-verifying?
Given the likelihood of multiple secret keys being compromised, a release has to get X number of signatures before it is considered verified by clients, and thus downloadable. Whether the signatures are independent or m-of-n multisig is something to explore. In the case of paying for co-verifying, I think it will have the wrong incentives if an invalid verification isn't penalized somehow and the affected users reimbursed?
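A sketch of the independent-signatures variant, counting distinct endorsers of a release hash before allowing the download; all names here are hypothetical:

```typescript
// "X signatures before downloadable": each endorsement is an independent
// nostr event vouching for the release; a real system might use m-of-n
// multisig instead of counting separate signatures.

interface Endorsement {
  signerPubkey: string;
  releaseHash: string; // sha-256 of the endorsed artifact
}

function isDownloadable(
  endorsements: Endorsement[],
  releaseHash: string,
  threshold: number, // the X above
): boolean {
  const signers = new Set(
    endorsements
      .filter((e) => e.releaseHash === releaseHash)
      .map((e) => e.signerPubkey), // count each signer once
  );
  return signers.size >= threshold;
}
```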
Even if someone you trust gets compromised in that way, the WoT score of the scam would be very low. That's the magic of WoT, and of how humans already behave.
This product is using NIP-94 (and another NIP coming up soon) for the distribution and verification of artifacts. Android enforces TOFU and key pinning at the OS level (APKs are signed); PWAs have none of this. Installing them via zap.store can emulate those features (same signature required for updates, downgrades prevented).
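For reference, a NIP-94 file metadata event (kind 1063) for an APK release looks roughly like this; all values are placeholders:

```typescript
// Rough shape of a NIP-94 file metadata event (kind 1063).
const fileMetadata = {
  kind: 1063,
  content: "Example app v1.2.3", // description / caption
  tags: [
    ["url", "https://example.com/app-v1.2.3.apk"],
    ["m", "application/vnd.android.package-archive"], // MIME type
    ["x", "<sha-256 of the file>"],                   // integrity hash
    ["size", "12345678"],                             // bytes
    ["fallback", "https://mirror.example.com/app-v1.2.3.apk"],
  ],
};
```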
TOFU 😁 I was not familiar with "trust on first use" being called that. I'm very excited about app distribution via nostr. This might be the killer app, but then again, the social graph might on its own be the killer feature that many tiny baby killer apps build on. Let's make the App Store obsolete! (followed)
I did not get involved when NIP-94 was drafted, but now I wonder why the URL is not optional when the event might be used for torrents, and why there is no filename. The fallback is also not well specced: is `fallback` allowed to occur multiple times, or may it be `[fallback, fallback1, fallback2, ...]`?
Could you please explain this to me like I'm completely stupid? 😃 Let's say there are 5 users that already have a solid WoT score. Now 2 of them get hacked, and the hacker installs and "verifies" an AppX. Unless the other users somehow get notified of the hack, they will see AppX as "verified" and may potentially install the app too, no? What would prevent it? Some lower-bound limit on how many people are needed to "verify"?