#70 closed enhancement (wontfix)
idea: integrate captcha into some furls
| Reported by: | samuelstoller | Owned by: | nobody |
|---|---|---|---|
| Priority: | minor | Milestone: | undecided |
| Component: | code | Version: | 0.4.0 |
| Keywords: | uri captcha | Cc: | |
| Launchpad Bug: | | | |
Description (last modified by amontero)
Often when I share data with other people on the internet I do the following:
- Put the data in a directory on my web server.
- Add some .htaccess fu to my web server to protect the data with HTTP basic auth.
- Mail the people a link to the data; on a line following the URL, I include the username and password needed to access it.
This way if a search engine grabs the link, the data won't be accessible to the whole internet. (A human needs to read the email message and locate the http auth data necessary to follow the link.)
Furls could emulate this captcha-like behavior:
- have an 'incomplete' furl type: missing some critical bytes, but complete enough that the client can figure out what's going on and prompt the user for a key.
- the furl would be completed with a 'key' which a user can pass out-of-band.
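The scheme above can be sketched in a few lines. This is a minimal illustration only; the cap string, the split point, and the function names are all hypothetical, not Tahoe's or Foolscap's actual formats or APIs:

```python
# Hypothetical sketch of the "incomplete furl + out-of-band key" idea.
# The capability format and the 8-byte split are illustrative choices.

def make_incomplete(cap, key_len=8):
    """Split a capability string into an incomplete part (safe to mail)
    and a short key to be handed over via a separate channel."""
    return cap[:-key_len], cap[-key_len:]

def complete(incomplete, key):
    """Recombine the incomplete cap with the user-supplied key."""
    return incomplete + key

cap = "URI:CHK:ihrbeov7lbvoduupd4qblysj7a:example"
incomplete, key = make_incomplete(cap)
assert complete(incomplete, key) == cap
```

A client seeing the incomplete form would recognize it as a cap but refuse to fetch anything until the human supplies the missing key.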
Change History (8)
comment:1 Changed at 2007-07-01T03:22:59Z by zooko
- Description modified (diff)
comment:2 Changed at 2007-07-02T19:01:33Z by warner
hrm. let's clear up some terminology first:
"furls": these are identifiers used by Foolscap to point at instances: code objects with methods on them, meant to be invoked by other code
"uris": identifiers used by Tahoe to point at files or directories: meant to be used by either humans or code
Is this a tahoe issue or a foolscap issue? I'll assume you mean to say that uris could have some only-for-use-by-human properties. Furls are meant for use by programs.
The uri already has security properties built in: possession of the uri is both necessary and sufficient to access the file. If you don't want a search engine to make the file available, don't let a search engine see the uri.
Now, that's a separate question from making for-human-eyes-only URIs. At the moment, since tahoe is not particularly widespread, tahoe URIs *are* for-human-eyes-only, since it would take a human to copy one out of a web page and paste it into a tahoe client. But even if google got excited about crawling into a tahoe mesh to index the contents of all the tahoe URIs it discovered, it would probably be sufficient to remove the 'URI:' prefix from a published tahoe uri: that would be enough to cause a tahoe client to reject the string as invalid.
Or, you could split the URI in half and send each side to someone via a separate channel.
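A toy sketch of that split-across-channels approach (the URI string and function name here are made up for illustration):

```python
def split_uri(uri):
    """Split a URI into two halves to be sent via separate channels;
    neither half alone is enough to retrieve the file."""
    mid = len(uri) // 2
    return uri[:mid], uri[mid:]

first, second = split_uri("URI:CHK:ihrbeov7lbvoduupd4qblysj7a:example")
assert first + second == "URI:CHK:ihrbeov7lbvoduupd4qblysj7a:example"
```

The recipient simply concatenates the two halves; an indexer that sees only one channel never learns a usable URI.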
What is the goal? Is it to hide data from google? Should the URI be sufficient for a human to retrieve the file, but not a computer?
comment:3 Changed at 2007-07-02T19:20:53Z by warner
- Milestone set to undecided
- Priority changed from major to minor
comment:4 Changed at 2007-07-12T19:01:27Z by warner
- Component changed from code to unknown
- Owner changed from somebody to nobody
comment:5 Changed at 2008-06-01T20:53:13Z by warner
- Milestone changed from eventually to undecided
comment:6 Changed at 2011-05-21T15:05:21Z by davidsarah
I suggest wontfix.
comment:7 Changed at 2011-08-27T01:52:31Z by davidsarah
- Component changed from unknown to code
- Keywords uri captcha added
- Resolution set to wontfix
- Status changed from new to closed
comment:8 Changed at 2013-12-28T16:59:11Z by amontero
- Description modified (diff)
@samuelstoller: check lafs-rpg at https://bitbucket.org/nejucomo/lafs-rpg and/or this use case in #2144, in case either can help.
Just fixing the wikiformatting thing.