update readme

Kai Hendry
2012-01-28 13:44:29 +08:00
parent 5c60f06e90
commit df971681d4
2 changed files with 12 additions and 18 deletions

View File

@@ -1,12 +1,13 @@
 # <http://greptweet.com>
 * Authentication free, using <http://dev.twitter.com/doc/get/statuses/user_timeline>
-* Aim to [suckless](http://suckless.org) by keeping LOC low
+* Aim to [suckless](http://suckless.org) by keeping lines of code low
 * Encourage folks to use `fetch-tweets.sh` themselves and get into shell ;)
-* Dependencies: curl, libhtml-parser-perl (to decode HTML entities), xmlstarlet, coreutils
+* Dependencies: curl, libhtml-parser-perl (to decode HTML entities), xmlstarlet, coreutils, PHP
 * Look and feel mostly by <http://twitter.github.com/bootstrap/>
-# Known issues
+* **Please** review and comment on the code!
+# Known limitations
 * API only allows 3200 tweets to be downloaded this way :((
 * 150 API limit on the server ... (so clone it and use it yourself!)
@@ -15,13 +16,13 @@
 ## Fetching already!
-Closing a tab whilst creating an account,
+Closing a window whilst creating an account,
 <http://greptweet/create.cgi?id=example>, can cause issues. Need to study
 <http://mywiki.wooledge.org/ProcessManagement>.
-## Twitter can be flaky
-Twitter does not allow the possibility of retrieving more than 3200 tweets.
+## @twitterapi is super flaky
+Twitter does not allow the possibility of retrieving more than 3200 tweets. :(
 However twitter generally stalls before coming close to this limit. Please
 consider complaining to Twitter about this issue.
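The 3200-tweet cap discussed above falls straight out of the old API's paging parameters: at the maximum of 200 statuses per request, the timeline stops paying out after 16 pages. A minimal sketch of that paging arithmetic, assuming the historical v1 `user_timeline` endpoint and its `count`/`page` parameters (long retired, so the script only prints the URLs it would fetch rather than hitting the network):

```shell
#!/bin/sh
# Sketch only: the unauthenticated v1 endpoint below is historical and no
# longer answers, so we print each request URL instead of curl-ing it.
user=${1:-example}
count=200                # maximum statuses per request
cap=3200                 # the API refuses to page back past this many
pages=$((cap / count))   # 3200 / 200 = 16 requests at most
page=1
while test "$page" -le "$pages"
do
	echo "http://api.twitter.com/1/statuses/user_timeline.xml?screen_name=$user&count=$count&page=$page"
	page=$((page + 1))
done
```

Anything older than those 16 pages is simply unreachable, which is what the "Approximately ... missing tweets" message in the fetch script is counting.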
@@ -30,13 +31,6 @@ to any already existing tweets.
 I did file <https://dev.twitter.com/discussions/3414>, which later seemed to be fixed.
-# Shell script feedback (progressive loading) on the Web is quite tricky
-<http://stackoverflow.com/questions/3547488>
-Simple approaches tried:
-* Outputting more than one should (doesn't work actually)
-* Different newer servers of Apache seem to do transfer chunking better
-Our host is limited by a "stable" version of Apache.
+## Shell script feedback on the Web works by disabling Apache's mod_deflate !
+<http://stackoverflow.com/a/9022823/4534>
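The mod_deflate trick in the retitled section works because Apache's mod_deflate skips any response whose environment carries the `no-gzip` variable, so CGI output can stream to the browser line by line instead of being buffered up for compression. A hypothetical `.htaccess` fragment illustrating the idea (the `<Files>` target is an assumption based on the `create.cgi` name mentioned above):

```apache
# Hypothetical sketch: mod_deflate honours the no-gzip environment
# variable and leaves matching responses uncompressed, so the CGI's
# progress output reaches the browser as it is produced.
<Files "create.cgi">
    SetEnv no-gzip 1
</Files>
```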

View File

@@ -104,7 +104,7 @@ fi
 if test -f $1.txt
 then
 mv $1.txt $temp
 before=$(wc -l < "$temp")
 else
 before=0
 > $temp
@@ -117,7 +117,7 @@ echo Before: $before After: $after
 if test "$before" -eq "$after"
 then
-echo Uable to retrieve anything new. Approximately $(( $twitter_total - $after)) missing tweets
+echo Unable to retrieve anything new. Approximately $(( $twitter_total - $after)) missing tweets
 rm -f $temp $temp2
 exit
 fi
@@ -129,4 +129,4 @@ echo $saved
 done
-echo $1 saved $saved tweets of "$twitter_total": You are uptodate!
+echo $1 saved $saved tweets of "$twitter_total": You are up-to-date!
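The before/after bookkeeping in the hunks above — the existing archive is moved to a temp file and line-counted, new material is appended, and the two counts are compared to decide whether anything new arrived — reduces to a few lines. A standalone sketch (file contents are made up for illustration; the appended line stands in for freshly fetched tweets):

```shell
#!/bin/sh
temp=$(mktemp)
printf 'tweet one\ntweet two\n' > "$temp"   # stand-in for the existing $1.txt archive
before=$(wc -l < "$temp")
printf 'tweet three\n' >> "$temp"           # stand-in for freshly fetched tweets
after=$(wc -l < "$temp")
if test "$before" -eq "$after"
then
	echo "Unable to retrieve anything new."
else
	echo "saved $((after - before)) new tweets"   # prints "saved 1 new tweets"
fi
rm -f "$temp"
```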