129 Commits

Author SHA1 Message Date
Zed
401aa26464 Improve proxied mp4 caching 2023-05-21 00:47:09 +02:00
Zed
93208908e6 Merge branch 'master' into feature/mp4-streaming 2023-05-20 22:23:12 +02:00
Zed
e3b3b38a2d Add proper tombstone for subscriber tweets 2023-05-20 02:10:37 +02:00
Zed
b516ec367b Remove outdated tweet from card test 2023-05-20 00:59:34 +02:00
Zed
774063fb5a Fix pinned "TweetPreviewDisplay" crash 2023-05-20 00:52:55 +02:00
Zed
2254a0728c depends -> needs 2023-04-21 23:52:14 +02:00
Zed
fe22a45c7e Make Docker workflow depend on tests 2023-04-21 23:43:46 +02:00
Zed
b67f3062fe Fix card tests 2023-04-21 23:33:52 +02:00
Zed
a64e2286d5 Update badges 2023-04-21 17:41:48 +02:00
Zed
1ac389e7c7 GraphQL timeline (#812)
* Update deps

* Replace profile timeline with GraphQL endpoint

* Update GraphQL endpoint versions

* Use GraphQL for profile media tab

* Fix UserByRestId request

* Improve routing, fixes #814

* Fix token pool JSON

* Deduplicate GraphQL timeline endpoints

* Update list endpoints

* Use GraphQL for list tweets

* Remove debug leftover

* Replace old pinned tweet endpoint with GraphQL

* Validate tweet ID

* Minor token handling fix

* Hide US-only commerce cards

* Update config example

* Remove http pool and gzip from token pool

* Support tombstoned tweets in threads

* Retry GraphQL timeout errors

* Remove unnecessary 401 retry

* Remove broken timeout retry

* Update karax, use new bool attribute feature

* Update card test

* Fix odd edgecase with broken retweets

* Replace search endpoints, switch Bearer token

* Only parse user search if it's a list

* Fix quoted tweet crash

* Fix empty search query handling

* Fix invalid user search errors again
2023-04-21 14:41:30 +02:00
Zed
e2560dc1f1 Prevent search endpoint from discarding tokens 2023-04-21 01:01:18 +02:00
Zed
95893eedaa Add actions workflow to run Selenium tests (#818) 2023-03-28 17:36:48 +02:00
Tommy
78cb405acd Additional hardening for Docker-Compose (#734) 2023-03-28 16:20:07 +02:00
Zed
23f4c6114c Switch to BuildJet runners (#817)
* Update deps

* Use BuildJet runners to speed up arm64
2023-03-28 16:03:15 +02:00
Jonas Chopin-Revel
e93de59118 Edit Dockerfile (#763)
Co-authored-by: Jonas Chopin-Revel <contact@jonas-chopin.com>
2023-03-28 15:18:16 +02:00
Tommy
bc219aa792 Change :ro to :Z (#735)
* Change :ro to :Z

* Update docker-compose.yml

Co-authored-by: Mark Vainomaa <mikroskeem@gmail.com>

---------

Co-authored-by: Mark Vainomaa <mikroskeem@gmail.com>
2023-03-06 23:40:01 +01:00
Zed
7edc5646fe And another one 2023-03-06 11:06:27 +01:00
Zed
ad9083149c Add yet another status error code 2023-03-06 11:00:27 +01:00
Zed
56420a28bd Add another error code to parseStatus 2023-03-06 00:22:27 +01:00
Zed
725f601c9b Add missing error codes for unavailable tweets 2023-03-04 00:46:44 +01:00
Zed
aea884c48e Deduplicate note tweet parsing 2023-03-03 21:27:06 +01:00
HookedBehemoth
368974c803 Update src/parser.nim
Co-authored-by: Zed <zedeus@pm.me>
2023-03-01 16:13:36 +01:00
HookedBehemoth
3a5faded86 parse recently added long tweets 2023-03-01 00:54:10 +01:00
Émilien Devos (perso)
756a39bdd2 Update note about ARM64 docker image + add docker hub links (#799) 2023-02-26 18:24:31 +01:00
Whizanth
33e34c2d17 Add openssl1.1-compat to Dockerfile.arm64 (#800) 2023-02-26 17:47:36 +01:00
Émilien Devos (perso)
514b16c417 Build and push ARM64 docker image (#798) 2023-02-26 15:06:52 +01:00
Whizanth
f4c20a48a0 Add Dockerfile for arm64 (#762) 2023-02-26 14:59:06 +01:00
Zed
c21511037e Uppercase HLS in preference description 2023-02-26 00:43:42 +01:00
Peter Dave Hello
54cc790bdb Remove Instagram -> Bibliogram replacement as it's discontinued (#772)
Close #761

Reference: https://cadence.moe/blog/2022-09-01-discontinuing-bibliogram
2023-02-26 00:39:08 +01:00
Zed
2751504d2f Fix crash on unavailable tweets 2023-02-25 18:25:02 +01:00
Zed
85f14c9a64 Fix crash when parsing non-GraphQL cached tweets 2023-02-24 01:02:28 +01:00
Zed
670a3bca6e Revert "Revert "Replace tweet endpoint with GraphQL""
This reverts commit 36c72f9860.
2023-02-24 01:01:22 +01:00
Zed
36c72f9860 Revert "Replace tweet endpoint with GraphQL"
This reverts commit 19adc658c3.
2023-02-02 12:38:58 +01:00
Zed
19adc658c3 Replace tweet endpoint with GraphQL 2023-02-02 12:31:49 +01:00
Zed
22b51b414b Fix typo to actually cache pinned tweets 2023-01-20 04:55:56 +01:00
Zed
ff61d97a1d Optimize profile fetching and caching 2023-01-20 04:55:42 +01:00
Zed
d38b63f5a9 Bump hls.js 2023-01-09 00:38:02 +01:00
Zed
cd163b26a3 Use webp for thumbnails when available 2023-01-09 00:29:59 +01:00
Zed
88c6135030 Bump dependencies 2023-01-09 00:14:21 +01:00
Zed
6c4c691fe7 Add Crawl-delay to robots.txt 2023-01-09 00:12:07 +01:00
Zed
b5dee25b37 Enable SO_REUSEPORT for simpler load balancing 2023-01-09 00:11:33 +01:00
Andrew Sun
d549a77ebe Fix arm64 build (#756) 2022-12-31 20:46:02 +01:00
Zed
687447cea4 Only retry on 401 once 2022-12-19 10:07:24 +01:00
Zed
d923c20aa6 Explicitly don't support 'model3d' cards
Fixes #597
2022-11-27 18:19:30 +01:00
Zed
9bae009bbc Fix 'unknown' compilation error 2022-11-27 17:27:07 +01:00
Zed
da4d6eabdd Add enum hooks to log parseHook jsony errors 2022-11-27 17:24:29 +01:00
Zed
6a7e56b613 Linting 2022-11-27 16:20:53 +01:00
Zed
78b6877b02 Add "Search (...)" to tab title
Fixes #247
2022-11-27 16:19:30 +01:00
Zed
096d44856f Remove Location field autofocus from search panel 2022-11-27 16:06:36 +01:00
Zed
310b0ab9a1 Update test 2022-11-27 16:00:16 +01:00
Zed
85316f8f8d Reduce usage of strformat, minor perf improvement 2022-11-27 15:50:08 +01:00
Zed
92a6eb0339 Bump Dockerfile Nim to 1.6.10 2022-11-27 12:49:50 +01:00
Zed
b1ef505cff Fix "Show this thread" for pinned threads 2022-11-27 01:57:32 +01:00
Zed
9ef0f649fd Bump dependencies 2022-11-27 01:02:49 +01:00
Zed
8ad8cef875 Revert /c/ removal from YouTube replacer
Fixes #724
2022-11-27 00:40:33 +01:00
Zed
dd18471568 Make YouTube regex case insensitive
Fixes #726
2022-11-27 00:28:49 +01:00
Zed
02557b67c3 Fix minor bug 2022-11-27 00:03:11 +01:00
Kavin
6272297d64 Update hostname for piped (#728) 2022-11-19 02:47:29 +00:00
Zed
2ac3afa5b2 Retry intermittent 401 Unauthorized requests 2022-09-21 05:50:16 +02:00
ringabout
1cf8e67567 bump karax version (#694)
Co-authored-by: xflywind <43030857+xflywind@users.noreply.github.com>
2022-09-15 14:47:02 +02:00
ringabout
8edfda39ec bump packedjson dependency to include a fix for ARC/ORC (#691)
Hello. `shallowCopy` has been removed for ARC/ORC since it does a deep copy for strings/seqs, which breaks the semantics of `shallowCopy`. https://github.com/Araq/packedjson/pull/13 is a fix for `packedjson` to support ARC/ORC. This PR bumps the `packedjson` dependency to include [a fix](9e6fbb63cb) for ARC/ORC.
2022-08-30 09:18:46 +02:00
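As a hedged aside (not code from the PR), this is roughly what `shallowCopy` is supposed to mean under the classic refc GC, and why silently degrading it to a deep copy breaks callers that rely on the aliasing:

```nim
# Rough illustration only, assuming the classic refc GC (Nim <= 1.6).
# Under ARC/ORC shallowCopy is unavailable or degrades to a deep copy,
# which is the breakage the bumped packedjson version works around.
var a = @[1, 2, 3]
var b: seq[int]
shallowCopy(b, a)   # b is meant to alias a's buffer, not copy it
b[0] = 99
echo a              # with refc this prints @[99, 2, 3]: both names share one buffer
```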
jackyzy823
077ebc95fa make video control bar fit parent div (#683) 2022-08-15 01:07:40 +02:00
Mico
5f31e86e0e Fixes selection issues on iOS devices (#671) 2022-07-21 15:04:27 +02:00
jackyzy823
17cb977e70 fix profile-website css (#669) 2022-07-17 18:42:40 +02:00
flywind
0200cc21a3 add threads:off to config file (#662) 2022-07-05 09:15:38 +02:00
Jules Bertholet
81ec41328d Add redirect for thread links (#647) 2022-06-23 22:47:03 +02:00
HookedBehemoth
5e827c91aa emit body and doctype on iframe embed endpoint (#640) 2022-06-16 18:46:41 +02:00
Frank Moskal
b398136742 update hls.js to v1.1.5 (#636) 2022-06-12 19:11:39 +02:00
Zed
6e490c2dd9 Improve gif html 2022-06-11 23:27:11 +02:00
Zed
00daab1f15 Disable mp4 preloading 2022-06-11 20:18:45 +02:00
Zed
608c3ca8df Lazy load images 2022-06-11 17:41:59 +02:00
zedeus
74c13b372d Format readme 2022-06-09 09:34:06 +02:00
zedeus
5cceca4e93 Bump zippy dependency 2022-06-09 09:25:45 +02:00
Zed
0610f7b890 Fix image route order 2022-06-08 22:20:28 +02:00
Zed
1dab9c9c61 Update mp4Playback preference text 2022-06-06 22:56:48 +02:00
Zed
651941acd1 Implement experimental mp4 streaming 2022-06-06 22:50:28 +02:00
Zed
d407051b66 Downgrade zippy library to fix checksum error 2022-06-05 22:40:10 +02:00
Zed
38bbc67757 Remove old unnecessary rate limit error log 2022-06-05 22:27:22 +02:00
Zed
adaa94d998 Add more logging to the token pool 2022-06-05 21:47:25 +02:00
Zed
138826fb4f Fix Twitterbot rule in robots.txt 2022-06-04 17:55:35 +02:00
Zed
dfb2519870 Explicitly allow Twitterbot to generate previews 2022-06-04 17:52:47 +02:00
minus
c543a1df8c Block search engines via robots.txt (#631)
Prevents instances from being rate limited due to being senselessly
crawled by search engines. Since there is no reason to index Nitter
instances, simply block all robots. Notably, this does *not* affect link
previews (e.g. in various chat software).
2022-06-04 17:48:25 +02:00
Zed
778c6c64cb Use a different quote for testing 2022-06-04 02:26:43 +02:00
Zed
21e8f04fa4 Use strformat more 2022-06-04 02:18:26 +02:00
Zed
6709f6f1b5 Fix "playback disabled" message 2022-06-04 01:32:02 +02:00
Zed
2a5541c126 Merge branch 'fork-graveyard/master' 2022-06-04 01:23:55 +02:00
Zed
93f605f4fe Update deps 2022-06-04 01:11:35 +02:00
Zed
6c83e87292 Update outdated tests 2022-06-04 01:00:44 +02:00
Zed
90135d44e1 Merge branch 'original-image-urls' 2022-06-04 00:47:56 +02:00
girst
e2b8e17f85 use largest resolution mp4 video available 2022-05-18 19:52:17 +02:00
girst
0633ec2c39 Prefer mp4 to m3u8 for Video Playback if proxyVideos is off
m3u8 videos only work when the proxy is enabled. Further, this allows
video playback without Javascript.

This is only done when proxying is disabled to avoid excessive memory
usage on the nitter instance that would result from loading longer
videos in a single chunk.
2022-05-18 19:51:38 +02:00
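A minimal sketch of that preference logic, using assumed type and proc names rather than the actual Nitter code:

```nim
type VideoVariant = object
  contentType: string   # e.g. "application/x-mpegURL" or "video/mp4"
  url: string

# Hypothetical helper: with the proxy enabled, HLS (m3u8) playback is fine;
# without it, serve a direct mp4 link so playback works without JavaScript
# and long videos never have to be buffered by the Nitter instance.
proc pickPlaybackUrl(variants: seq[VideoVariant]; proxyVideos: bool): string =
  if proxyVideos:
    for v in variants:
      if v.contentType == "application/x-mpegURL":
        return v.url
  for v in variants:
    if v.contentType == "video/mp4":
      return v.url
```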
Zed
683c052036 Fix placeholders in photo rail 2022-05-13 20:31:42 +02:00
Zed
a98cf9245a Fix outdated tests 2022-05-13 19:20:34 +02:00
Zed
3710a4ddec Merge branch '110Percent/master' 2022-05-13 13:36:46 +02:00
Zed
3526318482 Merge pull request #598 from nopdotcom/docker-expose-port
Dockerfile: Move EXPOSE outside the builder stage
2022-05-10 14:10:53 +02:00
Zed
6116a44d83 Merge pull request #566 from dennis1248/master
Implement healthcheck on nitter and redis
2022-05-10 14:10:44 +02:00
Zed
3e7898ecb5 Merge pull request #599 from nopdotcom/docker-ca-certs
Docker: add ca-certificates to runtime
2022-05-10 14:09:13 +02:00
Jay Carlson
a21a33f835 Docker: add ca-certificates to runtime
Current Alpine doesn't ship with the "well-known" CA
certificates. Without them present, outbound HTTPS or TLS connections
aren't even initiated, leading to this at startup:

```
fetching token failed: No SSL/TLS CA certificates found.
```

This adds `ca-certificates` to the runtime image.
2022-04-01 17:54:58 -07:00
Jay Carlson
b5262dc010 Dockerfile: Fix EXPOSE outside the build stages 2022-04-01 16:37:02 -07:00
Zed
2536370cd7 Minor fixes, update tests 2022-04-01 17:56:55 +02:00
Zed
6884f05041 Merge pull request #584 from wasi-master/add-dracula-theme
Add dracula theme
2022-03-21 13:32:50 +01:00
Wasi Master
2b20b24abf Fix contrast for the dracula theme 2022-03-21 17:49:07 +06:00
Wasi Master
731f814d37 Add dracula theme 2022-03-21 11:06:13 +06:00
Zed
2dc91f8453 Merge pull request #571 from jackyzy823/fix-listmember
Fix list members not displaying
2022-03-10 16:46:14 +01:00
jackyzy823
9b97ddc8d8 Fix list members not displaying 2022-03-10 21:24:57 +08:00
Dennis ten Hoove
36a24cf553 Implement healthcheck on nitter and redis 2022-03-04 17:15:59 +01:00
Curtis
ce5cbaeb87 Implement getSource proc 2022-02-26 19:02:34 -05:00
Curtis
baf5bf89e7 Implement requested changes 2022-02-26 19:00:06 -05:00
110Percent
8f65bd583d Remove accidental newline 2022-02-26 13:40:38 -05:00
110Percent
b031ea762e Add source label to main tweets 2022-02-26 13:37:17 -05:00
Zed
ecb6fe4162 Merge pull request #555 from yoshiyoshyosh/master
CSS Tweak to fix avatar issues when "hide tweet stats" is enabled
2022-02-20 21:24:20 +01:00
Zed
e789fdbe5a Merge pull request #557 from jackyzy823/fix-bio-hashtag
Fix bio hashtag
2022-02-20 21:23:53 +01:00
Zed
716e869594 Merge pull request #558 from jackyzy823/fix-banner-404
ignore banner url starting with hash in link preload
2022-02-20 20:54:56 +01:00
jackyzy823
c7a78a7f43 fix slashes between hashtags 2022-02-20 14:29:35 +01:00
yoshiyoshyosh
3590ded482 revert previous "fix", move absolute under embed 2022-02-19 20:01:11 -06:00
yoshiyoshyosh
139b7edebc fix embed css 2022-02-19 19:12:26 -06:00
yoshiyoshyosh
2c97cb0763 Update _base.scss 2022-02-19 16:37:12 -06:00
jackyzy823
67c937a3a2 ignore banner url starting with hash in link preload 2022-02-14 14:47:01 +01:00
jackyzy823
7ca3591920 fix non-ascii hashtag regex 2022-02-14 09:20:32 +01:00
Zed
6695784050 Fix user search parser 2022-02-11 22:41:45 +01:00
Zed
a7f585feaa Merge pull request #548 from niladic/fix-rss-source-tag
Fix RSS <source> tag
2022-02-11 10:05:48 +01:00
niladic
3e2756df6f Fix RSS <source> tag 2022-02-11 08:50:32 +01:00
Zed
b01b5eb397 Merge pull request #547 from LainLayer/master
Bumped karax version
2022-02-10 12:14:33 +01:00
Mitarashi
6e4620070e Bumped karax version 2022-02-10 13:13:02 +02:00
decoy-walrus
644fe41a08 Use the correct format string for fetching files from twitter.
Per their docs https://developer.twitter.com/en/docs/twitter-api/v1/data-dictionary/object-model/entities#photo_format
2022-02-08 14:50:56 -05:00
decoy-walrus
4cdb8f78cb Add new endpoint for original resolution images
This change works around an issue chromium-based browsers have when handling the "name=orig" parameter appended to URLs. That parameter is needed to retrieve the full-resolution image from Twitter, but it causes those browsers to fill in "jpg_name=orig" as the extension on the filename.

This change adds a new endpoint, "/pic/orig/<encoded media>". This new endpoint will internally fetch the URL with ":orig" appended on the end for the full res image. Externally, the endpoint will serve the image without the extra parameter to expose the real extension to the browser.

This new endpoint is used when rendering tweets with attached images. The old endpoint is still in place for all other proxied images, and for any legacy links.

I also updated the "?name=small" parameter to ":small" since that seems to be the new pattern for image sizing.

This should fix issue #458.
2022-02-07 16:21:20 -05:00
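A minimal sketch of what the new endpoint does internally (hypothetical helper name, not the actual route code):

```nim
import std/uri

# Decode the "/pic/orig/<encoded media>" path segment and request the
# full-resolution ":orig" variant from Twitter, so the browser only ever
# sees the clean original filename and extension.
proc origMediaUrl(encodedMedia: string): string =
  let mediaUrl = decodeUrl(encodedMedia)  # e.g. a pbs.twimg.com media URL
  result = mediaUrl & ":orig"
```

Thumbnails in tweets would request the `:small` variant the same way.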
Zed
adfd31c530 Fix timeline parser 2022-02-07 21:58:06 +01:00
Zed
8b2fa986b5 Merge pull request #538 from LainLayer/master
Embed CSS tweak
2022-02-05 21:22:08 +01:00
Mitarashi
e01f3960cb Embed CSS tweak 2022-02-05 19:43:13 +02:00
57 changed files with 1117 additions and 593 deletions


@@ -1,4 +1,4 @@
name: CI/CD
name: Docker
on:
push:
@@ -8,31 +8,55 @@ on:
- master
jobs:
build-docker:
runs-on: ubuntu-latest
tests:
uses: ./.github/workflows/run-tests.yml
build-docker-amd64:
needs: [tests]
runs-on: buildjet-2vcpu-ubuntu-2204
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
uses: docker/setup-buildx-action@v2
with:
version: latest
- name: Login to DockerHub
uses: docker/login-action@v1
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push
uses: docker/build-push-action@v2
- name: Build and push AMD64 Docker image
uses: docker/build-push-action@v3
with:
context: .
file: ./Dockerfile
platforms: linux/amd64
push: true
tags: zedeus/nitter:latest,zedeus/nitter:${{ github.sha }}
build-docker-arm64:
needs: [tests]
runs-on: buildjet-2vcpu-ubuntu-2204-arm
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v2
with:
version: latest
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push ARM64 Docker image
uses: docker/build-push-action@v3
with:
context: .
file: ./Dockerfile.arm64
platforms: linux/arm64
push: true
tags: zedeus/nitter:latest-arm64,zedeus/nitter:${{ github.sha }}-arm64

.github/workflows/run-tests.yml

@@ -0,0 +1,45 @@
name: Tests
on:
push:
paths-ignore:
- "*.md"
branches-ignore:
- master
workflow_call:
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Cache nimble
id: cache-nimble
uses: actions/cache@v3
with:
path: ~/.nimble
key: nimble-${{ hashFiles('*.nimble') }}
restore-keys: "nimble-"
- uses: actions/setup-python@v4
with:
python-version: "3.10"
cache: "pip"
- uses: jiro4989/setup-nim-action@v1
with:
nim-version: "1.x"
- run: nimble build -d:release -Y
- run: pip install seleniumbase
- run: seleniumbase install chromedriver
- uses: supercharge/redis-github-action@1.5.0
- name: Prepare Nitter
run: |
sudo apt install libsass-dev -y
cp nitter.example.conf nitter.conf
nimble md
nimble scss
- name: Run tests
run: |
./nitter &
pytest -n4 tests

.gitignore

@@ -3,9 +3,11 @@ nitter
*.db
/tests/__pycache__
/tests/geckodriver.log
/tests/downloaded_files/*
/tests/downloaded_files
/tests/latest_logs
/tools/gencss
/tools/rendermd
/public/css/style.css
/public/md/*.html
nitter.conf
dump.rdb


@@ -1,6 +1,5 @@
FROM nimlang/nim:1.6.2-alpine-regular as nim
FROM nimlang/nim:1.6.10-alpine-regular as nim
LABEL maintainer="setenforce@protonmail.com"
EXPOSE 8080
RUN apk --no-cache add libsass-dev pcre
@@ -16,8 +15,11 @@ RUN nimble build -d:danger -d:lto -d:strip \
FROM alpine:latest
WORKDIR /src/
RUN apk --no-cache add pcre
RUN apk --no-cache add pcre ca-certificates
COPY --from=nim /src/nitter/nitter ./
COPY --from=nim /src/nitter/nitter.example.conf ./nitter.conf
COPY --from=nim /src/nitter/public ./public
EXPOSE 8080
RUN adduser -h /src/ -D -s /bin/sh nitter
USER nitter
CMD ./nitter

Dockerfile.arm64

@@ -0,0 +1,23 @@
FROM alpine:3.17 as nim
LABEL maintainer="setenforce@protonmail.com"
RUN apk --no-cache add gcc git libc-dev libsass-dev "nim=1.6.8-r0" nimble pcre
WORKDIR /src/nitter
COPY nitter.nimble .
RUN nimble install -y --depsOnly
COPY . .
RUN nimble build -d:danger -d:lto -d:strip \
&& nimble scss \
&& nimble md
FROM alpine:3.17
WORKDIR /src/
RUN apk --no-cache add ca-certificates pcre openssl1.1-compat
COPY --from=nim /src/nitter/nitter ./
COPY --from=nim /src/nitter/nitter.example.conf ./nitter.conf
COPY --from=nim /src/nitter/public ./public
EXPOSE 8080
CMD ./nitter


@@ -1,6 +1,7 @@
# Nitter
[![Test Matrix](https://github.com/zedeus/nitter/workflows/CI/CD/badge.svg)](https://github.com/zedeus/nitter/actions?query=workflow%3ACI/CD)
[![Test Matrix](https://github.com/zedeus/nitter/workflows/Tests/badge.svg)](https://github.com/zedeus/nitter/actions/workflows/run-tests.yml)
[![Test Matrix](https://github.com/zedeus/nitter/workflows/Docker/badge.svg)](https://github.com/zedeus/nitter/actions/workflows/build-docker.yml)
[![License](https://img.shields.io/github/license/zedeus/nitter?style=flat)](#license)
A free and open source alternative Twitter front-end focused on privacy and
@@ -34,7 +35,7 @@ XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscW
## Resources
The wiki contains
[a list of instances](https://github.com/zedeus/nitter/wiki/Instances) and
[browser extensions](https://github.com/zedeus/nitter/wiki/Extensions)
maintained by the community.
@@ -67,9 +68,10 @@ Twitter account.
## Installation
### Dependencies
* libpcre
* libsass
* redis
- libpcre
- libsass
- redis
To compile Nitter you need a Nim installation, see
[nim-lang.org](https://nim-lang.org/install.html) for details. It is possible to
@@ -108,25 +110,32 @@ performance reasons.
### Docker
#### NOTE: For ARM64/ARM support, please use [unixfox's image](https://quay.io/repository/unixfox/nitter?tab=tags), more info [here](https://github.com/zedeus/nitter/issues/399#issuecomment-997263495)
Page for the Docker image: https://hub.docker.com/r/zedeus/nitter
#### NOTE: For ARM64 support, please use the separate ARM64 docker image: [`zedeus/nitter:latest-arm64`](https://hub.docker.com/r/zedeus/nitter/tags).
To run Nitter with Docker, you'll need to install and run Redis separately
before you can run the container. See below for how to also run Redis using
Docker.
To build and run Nitter in Docker:
```bash
docker build -t nitter:latest .
docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d --network host nitter:latest
```
Note: For ARM64, use this Dockerfile: [`Dockerfile.arm64`](https://github.com/zedeus/nitter/blob/master/Dockerfile.arm64).
A prebuilt Docker image is provided as well:
```bash
docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d --network host zedeus/nitter:latest
```
Using docker-compose to run both Nitter and Redis as different containers:
Change `redisHost` from `localhost` to `nitter-redis` in `nitter.conf`, then run:
```bash
docker-compose up -d
```


@@ -1,5 +1,6 @@
--define:ssl
--define:useStdLib
--threads:off
# workaround httpbeast file upload bug
--assertions:off


@@ -8,10 +8,21 @@ services:
ports:
- "127.0.0.1:8080:8080" # Replace with "8080:8080" if you don't use a reverse proxy
volumes:
- ./nitter.conf:/src/nitter.conf:ro
- ./nitter.conf:/src/nitter.conf:Z,ro
depends_on:
- nitter-redis
restart: unless-stopped
healthcheck:
test: wget -nv --tries=1 --spider http://127.0.0.1:8080/Jack/status/20 || exit 1
interval: 30s
timeout: 5s
retries: 2
user: "998:998"
read_only: true
security_opt:
- no-new-privileges:true
cap_drop:
- ALL
nitter-redis:
image: redis:6-alpine
@@ -20,6 +31,17 @@ services:
volumes:
- nitter-redis:/data
restart: unless-stopped
healthcheck:
test: redis-cli ping
interval: 30s
timeout: 5s
retries: 2
user: "999:1000"
read_only: true
security_opt:
- no-new-privileges:true
cap_drop:
- ALL
volumes:
nitter-redis:


@@ -1,11 +1,11 @@
[Server]
hostname = "nitter.net" # for generating links, change this to your own domain/ip
title = "nitter"
address = "0.0.0.0"
port = 8080
https = false # disable to enable cookies when not using https
httpMaxConnections = 100
staticDir = "./public"
title = "nitter"
hostname = "nitter.net"
[Cache]
listMinutes = 240 # how long to cache list info (not the tweets, so keep it high)
@@ -13,9 +13,9 @@ rssMinutes = 10 # how long to cache rss queries
redisHost = "localhost" # Change to "nitter-redis" if using docker-compose
redisPort = 6379
redisPassword = ""
redisConnections = 20 # connection pool size
redisConnections = 20 # minimum open connections in pool
redisMaxConnections = 30
# max, new connections are opened when none are available, but if the pool size
# new connections are opened when none are available, but if the pool size
# goes above this, they're closed when released. don't worry about this unless
# you receive tons of requests per second
@@ -23,23 +23,22 @@ redisMaxConnections = 30
hmacKey = "secretkey" # random key for cryptographic signing of video urls
base64Media = false # use base64 encoding for proxied media urls
enableRSS = true # set this to false to disable RSS feeds
enableDebug = false # enable request logs and debug endpoints
enableDebug = false # enable request logs and debug endpoints (/.tokens)
proxy = "" # http/https url, SOCKS proxies are not supported
proxyAuth = ""
tokenCount = 10
# minimum amount of usable tokens. tokens are used to authorize API requests,
# but they expire after ~1 hour, and have a limit of 187 requests.
# the limit gets reset every 15 minutes, and the pool is filled up so there's
# always at least $tokenCount usable tokens. again, only increase this if
# you receive major bursts all the time
# but they expire after ~1 hour, and have a limit of 500 requests per endpoint.
# the limits reset every 15 minutes, and the pool is filled up so there's
# always at least `tokenCount` usable tokens. only increase this if you receive
# major bursts all the time and don't have a rate limiting setup via e.g. nginx
# Change default preferences here, see src/prefs_impl.nim for a complete list
[Preferences]
theme = "Nitter"
replaceTwitter = "nitter.net"
replaceYouTube = "piped.kavin.rocks"
replaceYouTube = "piped.video"
replaceReddit = "teddit.net"
replaceInstagram = ""
proxyVideos = true
hlsPlayback = false
infiniteScroll = false


@@ -11,18 +11,18 @@ bin = @["nitter"]
# Dependencies
requires "nim >= 1.4.8"
requires "jester >= 0.5.0"
requires "karax#c71bc92"
requires "sass#e683aa1"
requires "nimcrypto#a5742a9"
requires "markdown#abdbe5e"
requires "packedjson#d11d167"
requires "supersnappy#2.1.1"
requires "jester#baca3f"
requires "karax#5cf360c"
requires "sass#7dfdd03"
requires "nimcrypto#4014ef9"
requires "markdown#158efe3"
requires "packedjson#9e6fbb6"
requires "supersnappy#6c94198"
requires "redpool#8b7c1db"
requires "https://github.com/zedeus/redis#d0a0e6f"
requires "zippy#0.7.3"
requires "flatty#0.2.3"
requires "jsony#d0e69bd"
requires "zippy#ca5989a"
requires "flatty#e668085"
requires "jsony#ea811be"
# Tasks


@@ -0,0 +1,41 @@
body {
--bg_color: #282a36;
--fg_color: #f8f8f2;
--fg_faded: #818eb6;
--fg_dark: var(--fg_faded);
--fg_nav: var(--accent);
--bg_panel: #343746;
--bg_elements: #292b36;
--bg_overlays: #44475a;
--bg_hover: #2f323f;
--grey: var(--fg_faded);
--dark_grey: #44475a;
--darker_grey: #3d4051;
--darkest_grey: #363948;
--border_grey: #44475a;
--accent: #bd93f9;
--accent_light: #caa9fa;
--accent_dark: var(--accent);
--accent_border: #ff79c696;
--play_button: #ffb86c;
--play_button_hover: #ffc689;
--more_replies_dots: #bd93f9;
--error_red: #ff5555;
--verified_blue: var(--accent);
--icon_text: #f8f8f2;
--tab: #6272a4;
--tab_selected: var(--accent);
--profile_stat: #919cbf;
}
.search-bar > form input::placeholder{
color: var(--fg_faded);
}

File diff suppressed because one or more lines are too long

public/robots.txt

@@ -0,0 +1,5 @@
User-agent: *
Disallow: /
Crawl-delay: 1
User-agent: Twitterbot
Disallow:


@@ -4,122 +4,143 @@ import packedjson
import types, query, formatters, consts, apiutils, parser
import experimental/parser as newParser
proc getGraphUser*(id: string): Future[User] {.async.} =
proc getGraphUser*(username: string): Future[User] {.async.} =
if username.len == 0: return
let
variables = %*{"screen_name": username}
params = {"variables": $variables, "features": gqlFeatures}
js = await fetchRaw(graphUser ? params, Api.userScreenName)
result = parseGraphUser(js)
proc getGraphUserById*(id: string): Future[User] {.async.} =
if id.len == 0 or id.any(c => not c.isDigit): return
let
variables = %*{"userId": id, "withSuperFollowsUserFields": true}
js = await fetchRaw(graphUser ? {"variables": $variables}, Api.userRestId)
variables = %*{"userId": id}
params = {"variables": $variables, "features": gqlFeatures}
js = await fetchRaw(graphUserById ? params, Api.userRestId)
result = parseGraphUser(js)
proc getGraphUserTweets*(id: string; kind: TimelineKind; after=""): Future[Timeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = userTweetsVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
(url, apiId) = case kind
of TimelineKind.tweets: (graphUserTweets, Api.userTweets)
of TimelineKind.replies: (graphUserTweetsAndReplies, Api.userTweetsAndReplies)
of TimelineKind.media: (graphUserMedia, Api.userMedia)
js = await fetch(url ? params, apiId)
result = parseGraphTimeline(js, "user", after)
proc getGraphListTweets*(id: string; after=""): Future[Timeline] {.async.} =
if id.len == 0: return
let
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = listTweetsVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphListTweets ? params, Api.listTweets)
result = parseGraphTimeline(js, "list", after)
proc getGraphListBySlug*(name, list: string): Future[List] {.async.} =
let
variables = %*{"screenName": name, "listSlug": list, "withHighlightedLabel": false}
url = graphListBySlug ? {"variables": $variables}
result = parseGraphList(await fetch(url, Api.listBySlug))
variables = %*{"screenName": name, "listSlug": list}
params = {"variables": $variables, "features": gqlFeatures}
result = parseGraphList(await fetch(graphListBySlug ? params, Api.listBySlug))
proc getGraphList*(id: string): Future[List] {.async.} =
let
variables = %*{"listId": id, "withHighlightedLabel": false}
url = graphList ? {"variables": $variables}
result = parseGraphList(await fetch(url, Api.list))
variables = %*{"listId": id}
params = {"variables": $variables, "features": gqlFeatures}
result = parseGraphList(await fetch(graphListById ? params, Api.list))
proc getGraphListMembers*(list: List; after=""): Future[Result[User]] {.async.} =
if list.id.len == 0: return
let
var
variables = %*{
"listId": list.id,
"cursor": after,
"withSuperFollowsUserFields": false,
"withBirdwatchPivots": false,
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false,
"withSuperFollowsTweetFields": false
"withReactionsPerspective": false
}
url = graphListMembers ? {"variables": $variables}
if after.len > 0:
variables["cursor"] = % after
let url = graphListMembers ? {"variables": $variables, "features": gqlFeatures}
result = parseGraphListMembers(await fetchRaw(url, Api.listMembers), after)
proc getListTimeline*(id: string; after=""): Future[Timeline] {.async.} =
proc getGraphTweetResult*(id: string): Future[Tweet] {.async.} =
if id.len == 0: return
let
ps = genParams({"list_id": id, "ranking_mode": "reverse_chronological"}, after)
url = listTimeline ? ps
result = parseTimeline(await fetch(url, Api.timeline), after)
variables = tweetResultVariables % id
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphTweetResult ? params, Api.tweetResult)
result = parseGraphTweetResult(js)
proc getUser*(username: string): Future[User] {.async.} =
if username.len == 0: return
let
ps = genParams({"screen_name": username})
json = await fetchRaw(userShow ? ps, Api.userShow)
result = parseUser(json, username)
proc getUserById*(userId: string): Future[User] {.async.} =
if userId.len == 0: return
let
ps = genParams({"user_id": userId})
json = await fetchRaw(userShow ? ps, Api.userShow)
result = parseUser(json)
proc getTimeline*(id: string; after=""; replies=false): Future[Timeline] {.async.} =
proc getGraphTweet(id: string; after=""): Future[Conversation] {.async.} =
if id.len == 0: return
let
ps = genParams({"userId": id, "include_tweet_replies": $replies}, after)
url = timeline / (id & ".json") ? ps
result = parseTimeline(await fetch(url, Api.timeline), after)
cursor = if after.len > 0: "\"cursor\":\"$1\"," % after else: ""
variables = tweetVariables % [id, cursor]
params = {"variables": variables, "features": gqlFeatures}
js = await fetch(graphTweet ? params, Api.tweetDetail)
result = parseGraphConversation(js, id)
proc getMediaTimeline*(id: string; after=""): Future[Timeline] {.async.} =
if id.len == 0: return
let url = mediaTimeline / (id & ".json") ? genParams(cursor=after)
result = parseTimeline(await fetch(url, Api.timeline), after)
proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} =
result = (await getGraphTweet(id, after)).replies
result.beginning = after.len == 0
proc getTweet*(id: string; after=""): Future[Conversation] {.async.} =
result = await getGraphTweet(id)
if after.len > 0:
result.replies = await getReplies(id, after)
proc getGraphSearch*(query: Query; after=""): Future[Result[Tweet]] {.async.} =
let q = genQueryParam(query)
if q.len == 0 or q == emptyQuery:
return Result[Tweet](query: query, beginning: true)
var
variables = %*{
"rawQuery": q,
"count": 20,
"product": "Latest",
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false
}
if after.len > 0:
variables["cursor"] = % after
let url = graphSearchTimeline ? {"variables": $variables, "features": gqlFeatures}
result = parseGraphSearch(await fetch(url, Api.search), after)
result.query = query
proc getUserSearch*(query: Query; page="1"): Future[Result[User]] {.async.} =
if query.text.len == 0:
return Result[User](query: query, beginning: true)
var url = userSearch ? {
"q": query.text,
"skip_status": "1",
"count": "20",
"page": page
}
result = parseUsers(await fetchRaw(url, Api.userSearch))
result.query = query
if page.len == 0:
result.bottom = "2"
elif page.allCharsInSet(Digits):
result.bottom = $(parseInt(page) + 1)
proc getPhotoRail*(name: string): Future[PhotoRail] {.async.} =
if name.len == 0: return
let
ps = genParams({"screen_name": name, "trim_user": "true"},
count="18", ext=false)
count="18", ext=false)
url = photoRail ? ps
result = parsePhotoRail(await fetch(url, Api.timeline))
proc getSearch*[T](query: Query; after=""): Future[Result[T]] {.async.} =
when T is User:
const
searchMode = ("result_filter", "user")
parse = parseUsers
fetchFunc = fetchRaw
else:
const
searchMode = ("tweet_search_mode", "live")
parse = parseTimeline
fetchFunc = fetch
let q = genQueryParam(query)
if q.len == 0 or q == emptyQuery:
return Result[T](beginning: true, query: query)
let url = search ? genParams(searchParams & @[("q", q), searchMode], after)
try:
result = parse(await fetchFunc(url, Api.search), after)
result.query = query
except InternalError:
return Result[T](beginning: true, query: query)
proc getTweetImpl(id: string; after=""): Future[Conversation] {.async.} =
let url = tweet / (id & ".json") ? genParams(cursor=after)
result = parseConversation(await fetch(url, Api.tweet), id)
proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} =
result = (await getTweetImpl(id, after)).replies
result.beginning = after.len == 0
proc getTweet*(id: string; after=""): Future[Conversation] {.async.} =
result = await getTweetImpl(id)
if after.len > 0:
result.replies = await getReplies(id, after)
proc getStatus*(id: string): Future[Tweet] {.async.} =
let url = status / (id & ".json") ? genParams()
result = parseStatus(await fetch(url, Api.status))
proc resolve*(url: string; prefs: Prefs): Future[string] {.async.} =
let client = newAsyncHttpClient(maxRedirects=0)
try:


@@ -17,13 +17,13 @@ proc genParams*(pars: openArray[(string, string)] = @[]; cursor="";
result &= p
if ext:
result &= ("ext", "mediaStats")
result &= ("include_ext_alt_text", "true")
result &= ("include_ext_media_availability", "true")
result &= ("include_ext_alt_text", "1")
result &= ("include_ext_media_availability", "1")
if count.len > 0:
result &= ("count", count)
if cursor.len > 0:
# The raw cursor often has plus signs, which sometimes get turned into spaces,
# so we need to them back into a plus
# so we need to turn them back into a plus
if " " in cursor:
result &= ("cursor", cursor.replace(" ", "+"))
else:
@@ -44,7 +44,7 @@ proc genHeaders*(token: Token = nil): HttpHeaders =
})
template updateToken() =
if api != Api.search and resp.headers.hasKey(rlRemaining):
if resp.headers.hasKey(rlRemaining):
let
remaining = parseInt(resp.headers[rlRemaining])
reset = parseInt(resp.headers[rlReset])
@@ -61,12 +61,15 @@ template fetchImpl(result, fetchBody) {.dirty.} =
try:
var resp: AsyncResponse
pool.use(genHeaders(token)):
resp = await c.get($url)
result = await resp.body
template getContent =
resp = await c.get($url)
result = await resp.body
getContent()
if resp.status == $Http503:
badClient = true
raise newException(InternalError, result)
raise newException(BadClientError, "Bad client")
if result.len > 0:
if resp.headers.getOrDefault("content-encoding") == "gzip":
@@ -82,6 +85,9 @@ template fetchImpl(result, fetchBody) {.dirty.} =
raise newException(InternalError, $url)
except InternalError as e:
raise e
except BadClientError as e:
release(token, used=true)
raise e
except Exception as e:
echo "error: ", e.name, ", msg: ", e.msg, ", token: ", token[], ", url: ", url
if "length" notin e.msg and "descriptor" notin e.msg:
@@ -100,7 +106,7 @@ proc fetch*(url: Uri; api: Api): Future[JsonNode] {.async.} =
updateToken()
let error = result.getError
if error in {invalidToken, forbidden, badToken}:
if error in {invalidToken, badToken}:
echo "fetch error: ", result.getError
release(token, invalid=true)
raise rateLimitError()
@@ -115,7 +121,7 @@ proc fetchRaw*(url: Uri; api: Api): Future[string] {.async.} =
if result.startsWith("{\"errors"):
let errors = result.fromJson(Errors)
if errors in {invalidToken, forbidden, badToken}:
if errors in {invalidToken, badToken}:
echo "fetch error: ", errors
release(token, invalid=true)
raise rateLimitError()


@@ -1,28 +1,28 @@
# SPDX-License-Identifier: AGPL-3.0-only
import uri, sequtils
import uri, sequtils, strutils
const
auth* = "Bearer AAAAAAAAAAAAAAAAAAAAAPYXBAAAAAAACLXUNDekMxqa8h%2F40K4moUkGsoc%3DTYfbDKbT3jJPCEVnMYqilB28NHfOPqkca3qaAxGfsyKCs0wRbw"
auth* = "Bearer AAAAAAAAAAAAAAAAAAAAANRILgAAAAAAnNwIzUejRCOuH5E6I8xnZz4puTs%3D1Zv7ttfk8LF81IUq16cHjhLTvJu4FA33AGWWjCpTnA"
api = parseUri("https://api.twitter.com")
activate* = $(api / "1.1/guest/activate.json")
userShow* = api / "1.1/users/show.json"
photoRail* = api / "1.1/statuses/media_timeline.json"
status* = api / "1.1/statuses/show"
search* = api / "2/search/adaptive.json"
timelineApi = api / "2/timeline"
timeline* = timelineApi / "profile"
mediaTimeline* = timelineApi / "media"
listTimeline* = timelineApi / "list.json"
tweet* = timelineApi / "conversation"
userSearch* = api / "1.1/users/search.json"
graphql = api / "graphql"
graphUser* = graphql / "I5nvpI91ljifos1Y3Lltyg/UserByRestId"
graphList* = graphql / "JADTh6cjebfgetzvF3tQvQ/List"
graphListBySlug* = graphql / "ErWsz9cObLel1BF-HjuBlA/ListBySlug"
graphListMembers* = graphql / "Ke6urWMeCV2UlKXGRy4sow/ListMembers"
graphUser* = graphql / "pVrmNaXcxPjisIvKtLDMEA/UserByScreenName"
graphUserById* = graphql / "1YAM811Q8Ry4XyPpJclURQ/UserByRestId"
graphUserTweets* = graphql / "WzJjibAcDa-oCjCcLOotcg/UserTweets"
graphUserTweetsAndReplies* = graphql / "fn9oRltM1N4thkh5CVusPg/UserTweetsAndReplies"
graphUserMedia* = graphql / "qQoeS7szGavsi8-ehD2AWg/UserMedia"
graphTweet* = graphql / "miKSMGb2R1SewIJv2-ablQ/TweetDetail"
graphTweetResult* = graphql / "0kc0a_7TTr3dvweZlMslsQ/TweetResultByRestId"
graphSearchTimeline* = graphql / "gkjsKepM6gl_HmFWoWKfgg/SearchTimeline"
graphListById* = graphql / "iTpgCtbdxrsJfyx0cFjHqg/ListByRestId"
graphListBySlug* = graphql / "-kmqNvm5Y-cVrfvBy6docg/ListBySlug"
graphListMembers* = graphql / "P4NpVZDqUD_7MEM84L-8nw/ListMembers"
graphListTweets* = graphql / "jZntL0oVJSdjhmPcdbw_eA/ListLatestTweetsTimeline"
timelineParams* = {
"include_profile_interstitial_type": "0",
@@ -33,27 +33,89 @@ const
"include_mute_edge": "0",
"include_can_dm": "0",
"include_can_media_tag": "1",
"include_ext_is_blue_verified": "1",
"skip_status": "1",
"cards_platform": "Web-12",
"include_cards": "1",
"include_composer_source": "false",
"include_composer_source": "0",
"include_reply_count": "1",
"tweet_mode": "extended",
"include_entities": "true",
"include_user_entities": "true",
"include_ext_media_color": "false",
"send_error_codes": "true",
"simple_quoted_tweet": "true",
"include_quote_count": "true"
"include_entities": "1",
"include_user_entities": "1",
"include_ext_media_color": "0",
"send_error_codes": "1",
"simple_quoted_tweet": "1",
"include_quote_count": "1"
}.toSeq
searchParams* = {
"query_source": "typed_query",
"pc": "1",
"spelling_corrections": "1"
}.toSeq
## top: nothing
## latest: "tweet_search_mode: live"
## user: "result_filter: user"
## photos: "result_filter: photos"
## videos: "result_filter: videos"
gqlFeatures* = """{
"blue_business_profile_image_shape_enabled": false,
"creator_subscriptions_tweet_preview_api_enabled": true,
"freedom_of_speech_not_reach_fetch_enabled": false,
"graphql_is_translatable_rweb_tweet_is_translatable_enabled": false,
"highlights_tweets_tab_ui_enabled": false,
"interactive_text_enabled": false,
"longform_notetweets_consumption_enabled": true,
"longform_notetweets_inline_media_enabled": false,
"longform_notetweets_richtext_consumption_enabled": true,
"longform_notetweets_rich_text_read_enabled": false,
"responsive_web_edit_tweet_api_enabled": false,
"responsive_web_enhance_cards_enabled": false,
"responsive_web_graphql_exclude_directive_enabled": true,
"responsive_web_graphql_skip_user_profile_image_extensions_enabled": false,
"responsive_web_graphql_timeline_navigation_enabled": false,
"responsive_web_text_conversations_enabled": false,
"responsive_web_twitter_blue_verified_badge_is_enabled": true,
"rweb_lists_timeline_redesign_enabled": true,
"spaces_2022_h2_clipping": true,
"spaces_2022_h2_spaces_communities": true,
"standardized_nudges_misinfo": false,
"tweet_awards_web_tipping_enabled": false,
"tweet_with_visibility_results_prefer_gql_limited_actions_policy_enabled": false,
"tweetypie_unmention_optimization_enabled": false,
"verified_phone_label_enabled": false,
"vibe_api_enabled": false,
"view_counts_everywhere_api_enabled": false
}""".replace(" ", "").replace("\n", "")
tweetVariables* = """{
"focalTweetId": "$1",
$2
"withBirdwatchNotes": false,
"includePromotedContent": false,
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false,
"withVoice": false
}"""
tweetResultVariables* = """{
"tweetId": "$1",
"includePromotedContent": false,
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false,
"withVoice": false,
"withCommunity": false
}"""
userTweetsVariables* = """{
"userId": "$1", $2
"count": 20,
"includePromotedContent": false,
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false,
"withVoice": false,
"withV2Timeline": true
}"""
listTweetsVariables* = """{
"listId": "$1", $2
"count": 20,
"includePromotedContent": false,
"withDownvotePerspective": false,
"withReactionsMetadata": false,
"withReactionsPerspective": false,
"withVoice": false
}"""


@@ -1,2 +1,2 @@
import parser/[user, graphql, timeline]
export user, graphql, timeline
import parser/[user, graphql]
export user, graphql


@@ -1,11 +1,17 @@
import options
import jsony
import user, ../types/[graphuser, graphlistmembers]
from ../../types import User, Result, Query, QueryKind
proc parseGraphUser*(json: string): User =
let raw = json.fromJson(GraphUser)
if raw.data.user.result.reason.get("") == "Suspended":
return User(suspended: true)
result = toUser raw.data.user.result.legacy
result.id = raw.data.user.result.restId
result.verified = result.verified or raw.data.user.result.isBlueVerified
proc parseGraphListMembers*(json, cursor: string): Result[User] =
result = Result[User](


@@ -1,28 +0,0 @@
import std/[strutils, tables]
import jsony
import user, ../types/timeline
from ../../types import Result, User
proc getId(id: string): string {.inline.} =
let start = id.rfind("-")
if start < 0: return id
id[start + 1 ..< id.len]
proc parseUsers*(json: string; after=""): Result[User] =
result = Result[User](beginning: after.len == 0)
let raw = json.fromJson(Search)
if raw.timeline.instructions.len == 0:
return
for e in raw.timeline.instructions[0].addEntries.entries:
let id = e.entryId.getId
if e.entryId.startsWith("user"):
if id in raw.globalObjects.users:
result.content.add toUser raw.globalObjects.users[id]
elif e.entryId.startsWith("cursor"):
let cursor = e.content.operation.cursor
if cursor.cursorType == "Top":
result.top = cursor.value
elif cursor.cursorType == "Bottom":
result.bottom = cursor.value


@@ -66,6 +66,8 @@ proc parseMedia(component: Component; card: UnifiedCard; result: var Card) =
durationMs: videoInfo.durationMillis,
variants: videoInfo.variants
)
of model3d:
result.title = "Unsupported 3D model ad"
proc parseUnifiedCard*(json: string): Card =
let card = json.fromJson(UnifiedCard)
@@ -82,6 +84,10 @@ proc parseUnifiedCard*(json: string): Card =
component.parseMedia(card, result)
of buttonGroup:
discard
of ComponentType.hidden:
result.kind = CardKind.hidden
of ComponentType.unknown:
echo "ERROR: Unknown component type: ", json
case component.kind
of twitterListDetails:


@@ -1,14 +1,14 @@
import std/[algorithm, unicode, re, strutils, strformat, options]
import std/[algorithm, unicode, re, strutils, strformat, options, nre]
import jsony
import utils, slices
import ../types/user as userType
from ../../types import User, Error
from ../../types import Result, User, Error
let
unRegex = re"(^|[^A-z0-9-_./?])@([A-z0-9_]{1,15})"
unRegex = re.re"(^|[^A-z0-9-_./?])@([A-z0-9_]{1,15})"
unReplace = "$1<a href=\"/$2\">@$2</a>"
htRegex = re"(^|[^\w-_./?])([#$])([\w_]+)"
htRegex = nre.re"""(*U)(^|[^\w-_.?])([#$])([\w_]*+)(?!</a>|">|#)"""
htReplace = "$1<a href=\"/search?q=%23$3\">$2$3</a>"
proc expandUserEntities(user: var User; raw: RawUser) =
@@ -29,7 +29,7 @@ proc expandUserEntities(user: var User; raw: RawUser) =
user.bio = orig.replacedWith(replacements, 0 .. orig.len)
.replacef(unRegex, unReplace)
.replacef(htRegex, htReplace)
.replace(htRegex, htReplace)
proc getBanner(user: RawUser): string =
if user.profileBannerUrl.len > 0:
@@ -76,3 +76,12 @@ proc parseUser*(json: string; username=""): User =
else: echo "[error - parseUser]: ", error
result = toUser json.fromJson(RawUser)
proc parseUsers*(json: string; after=""): Result[User] =
result = Result[User](beginning: after.len == 0)
# starting with '{' means it's an error
if json[0] == '[':
let raw = json.fromJson(seq[RawUser])
for user in raw:
result.content.add user.toUser


@@ -1,3 +1,4 @@
import options
import user
type
@@ -10,3 +11,5 @@ type
UserResult = object
legacy*: RawUser
restId*: string
isBlueVerified*: bool
reason*: Option[string]


@@ -17,6 +17,8 @@ type
twitterListDetails
communityDetails
mediaWithDetailsHorizontal
hidden
unknown
Component* = object
kind*: ComponentType
@@ -47,7 +49,7 @@ type
vanity*: string
MediaType* = enum
photo, video
photo, video, model3d
MediaEntity* = object
kind*: MediaType
@@ -70,10 +72,37 @@ type
Text = object
content: string
HasTypeField = Component | Destination | MediaEntity | AppStoreData
TypeField = Component | Destination | MediaEntity | AppStoreData
converter fromText*(text: Text): string = text.content
proc renameHook*(v: var HasTypeField; fieldName: var string) =
proc renameHook*(v: var TypeField; fieldName: var string) =
if fieldName == "type":
fieldName = "kind"
proc enumHook*(s: string; v: var ComponentType) =
v = case s
of "details": details
of "media": media
of "swipeable_media": swipeableMedia
of "button_group": buttonGroup
of "app_store_details": appStoreDetails
of "twitter_list_details": twitterListDetails
of "community_details": communityDetails
of "media_with_details_horizontal": mediaWithDetailsHorizontal
of "commerce_drop_details": hidden
else: echo "ERROR: Unknown enum value (ComponentType): ", s; unknown
proc enumHook*(s: string; v: var AppType) =
v = case s
of "android_app": androidApp
of "iphone_app": iPhoneApp
of "ipad_app": iPadApp
else: echo "ERROR: Unknown enum value (AppType): ", s; androidApp
proc enumHook*(s: string; v: var MediaType) =
v = case s
of "video": video
of "photo": photo
of "model3d": model3d
else: echo "ERROR: Unknown enum value (MediaType): ", s; photo


@@ -12,8 +12,7 @@ let
twRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?twitter\.com"
twLinkRegex = re"""<a href="https:\/\/twitter.com([^"]+)">twitter\.com(\S+)</a>"""
ytRegex = re"([A-z.]+\.)?youtu(be\.com|\.be)"
igRegex = re"(www\.)?instagram\.com"
ytRegex = re(r"([A-z.]+\.)?youtu(be\.com|\.be)", {reStudy, reIgnoreCase})
rdRegex = re"(?<![.b])((www|np|new|amp|old)\.)?reddit.com"
rdShortRegex = re"(?<![.b])redd\.it\/"
@@ -56,8 +55,6 @@ proc replaceUrls*(body: string; prefs: Prefs; absolute=""): string =
if prefs.replaceYouTube.len > 0 and "youtu" in result:
result = result.replace(ytRegex, prefs.replaceYouTube)
if prefs.replaceYouTube in result:
result = result.replace("/c/", "/")
if prefs.replaceTwitter.len > 0 and ("twitter.com" in body or tco in body):
result = result.replace(tco, https & prefs.replaceTwitter & "/t.co")
@@ -72,11 +69,8 @@ proc replaceUrls*(body: string; prefs: Prefs; absolute=""): string =
if prefs.replaceReddit in result and "/gallery/" in result:
result = result.replace("/gallery/", "/comments/")
if prefs.replaceInstagram.len > 0 and "instagram.com" in result:
result = result.replace(igRegex, prefs.replaceInstagram)
if absolute.len > 0 and "href" in result:
result = result.replace("href=\"/", "href=\"" & absolute & "/")
result = result.replace("href=\"/", &"href=\"{absolute}/")
proc getM3u8Url*(content: string): string =
var matches: array[1, string]


@@ -42,5 +42,11 @@ template use*(pool: HttpPool; heads: HttpHeaders; body: untyped): untyped =
except ProtocolError:
# Twitter closed the connection, retry
body
except BadClientError:
# Twitter returned 503, we need a new client
pool.release(c, true)
badClient = false
c = pool.acquire(heads)
body
finally:
pool.release(c, badClient)


@@ -56,6 +56,7 @@ settings:
port = Port(cfg.port)
staticDir = cfg.staticDir
bindAddr = cfg.address
reusePort = true
routes:
get "/":
@@ -84,20 +85,23 @@ routes:
resp Http500, showError(
&"An error occurred, please {link} with the URL you tried to visit.", cfg)
error RateLimitError:
error BadClientError:
echo error.exc.name, ": ", error.exc.msg
resp Http500, showError("Network error occured, please try again.", cfg)
error RateLimitError:
const link = a("another instance", href = instancesUrl)
resp Http429, showError(
&"Instance has been rate limited.<br>Use {link} or try again later.", cfg)
extend unsupported, ""
extend preferences, ""
extend resolver, ""
extend rss, ""
extend status, ""
extend search, ""
extend timeline, ""
extend list, ""
extend status, ""
extend media, ""
extend list, ""
extend preferences, ""
extend resolver, ""
extend embed, ""
extend debug, ""
extend unsupported, ""


@@ -4,6 +4,8 @@ import packedjson, packedjson/deserialiser
import types, parserutils, utils
import experimental/parser/unifiedcard
proc parseGraphTweet(js: JsonNode): Tweet
proc parseUser(js: JsonNode; id=""): User =
if js.isNull: return
result = User(
@@ -19,13 +21,20 @@ proc parseUser(js: JsonNode; id=""): User =
tweets: js{"statuses_count"}.getInt,
likes: js{"favourites_count"}.getInt,
media: js{"media_count"}.getInt,
verified: js{"verified"}.getBool,
verified: js{"verified"}.getBool or js{"ext_is_blue_verified"}.getBool,
protected: js{"protected"}.getBool,
joinDate: js{"created_at"}.getTime
)
result.expandUserEntities(js)
proc parseGraphUser(js: JsonNode): User =
let user = ? js{"user_results", "result"}
result = parseUser(user{"legacy"})
if "is_blue_verified" in user:
result.verified = true
proc parseGraphList*(js: JsonNode): List =
if js.isNull: return
@@ -38,14 +47,13 @@ proc parseGraphList*(js: JsonNode): List =
result = List(
id: list{"id_str"}.getStr,
name: list{"name"}.getStr,
username: list{"user", "legacy", "screen_name"}.getStr,
userId: list{"user", "rest_id"}.getStr,
username: list{"user_results", "result", "legacy", "screen_name"}.getStr,
userId: list{"user_results", "result", "rest_id"}.getStr,
description: list{"description"}.getStr,
members: list{"member_count"}.getInt,
banner: list{"custom_banner_media", "media_info", "url"}.getImageStr
banner: list{"custom_banner_media", "media_info", "original_img_url"}.getImageStr
)
proc parsePoll(js: JsonNode): Poll =
let vals = js{"binding_values"}
# name format is pollNchoice_*
@@ -73,11 +81,11 @@ proc parseGif(js: JsonNode): Gif =
proc parseVideo(js: JsonNode): Video =
result = Video(
thumb: js{"media_url_https"}.getImageStr,
views: js{"ext", "mediaStats", "r", "ok", "viewCount"}.getStr,
available: js{"ext_media_availability", "status"}.getStr == "available",
views: js{"ext", "mediaStats", "r", "ok", "viewCount"}.getStr($js{"mediaStats", "viewCount"}.getInt),
available: js{"ext_media_availability", "status"}.getStr.toLowerAscii == "available",
title: js{"ext_alt_text"}.getStr,
durationMs: js{"video_info", "duration_millis"}.getInt
# playbackType: mp4
durationMs: js{"video_info", "duration_millis"}.getInt,
playbackType: m3u8
)
with title, js{"additional_media_info", "title"}:
@@ -87,10 +95,18 @@ proc parseVideo(js: JsonNode): Video =
result.description = description.getStr
for v in js{"video_info", "variants"}:
let
contentType = parseEnum[VideoType](v{"content_type"}.getStr("summary"))
url = v{"url"}.getStr
if contentType == mp4:
result.playbackType = mp4
result.variants.add VideoVariant(
contentType: parseEnum[VideoType](v{"content_type"}.getStr("summary")),
contentType: contentType,
bitrate: v{"bitrate"}.getInt,
url: v{"url"}.getStr
url: url,
resolution: if contentType == mp4: getMp4Resolution(url) else: 0
)
proc parsePromoVideo(js: JsonNode): Video =
@@ -181,7 +197,7 @@ proc parseCard(js: JsonNode; urls: JsonNode): Card =
result.url.len == 0 or result.url.startsWith("card://"):
result.url = getPicUrl(result.image)
proc parseTweet(js: JsonNode): Tweet =
proc parseTweet(js: JsonNode; jsCard: JsonNode = newJNull()): Tweet =
if js.isNull: return
result = Tweet(
id: js{"id_str"}.getId,
@@ -202,14 +218,26 @@ proc parseTweet(js: JsonNode): Tweet =
result.expandTweetEntities(js)
# fix for pinned threads
if result.hasThread and result.threadId == 0:
result.threadId = js{"self_thread", "id_str"}.getId
if js{"is_quote_status"}.getBool:
result.quote = some Tweet(id: js{"quoted_status_id_str"}.getId)
# legacy
with rt, js{"retweeted_status_id_str"}:
result.retweet = some Tweet(id: rt.getId)
return
with jsCard, js{"card"}:
# graphql
with rt, js{"retweeted_status_result", "result"}:
# needed due to weird edgecase where the actual tweet data isn't included
if "legacy" in rt:
result.retweet = some parseGraphTweet(rt)
return
if jsCard.kind != JNull:
let name = jsCard{"name"}.getStr
if "poll" in name:
if "image" in name:
@@ -229,7 +257,10 @@ proc parseTweet(js: JsonNode): Tweet =
of "video":
result.video = some(parseVideo(m))
with user, m{"additional_media_info", "source_user"}:
result.attribution = some(parseUser(user))
if user{"id"}.getInt > 0:
result.attribution = some(parseUser(user))
else:
result.attribution = some(parseGraphUser(user))
of "animated_gif":
result.gif = some(parseGif(m))
else: discard
@@ -286,70 +317,11 @@ proc parseGlobalObjects(js: JsonNode): GlobalObjects =
result.users[k] = parseUser(v, k)
for k, v in tweets:
var tweet = parseTweet(v)
var tweet = parseTweet(v, v{"card"})
if tweet.user.id in result.users:
tweet.user = result.users[tweet.user.id]
result.tweets[k] = tweet
proc parseThread(js: JsonNode; global: GlobalObjects): tuple[thread: Chain, self: bool] =
result.thread = Chain()
let thread = js{"content", "item", "content", "conversationThread"}
with cursor, thread{"showMoreCursor"}:
result.thread.cursor = cursor{"value"}.getStr
result.thread.hasMore = true
for t in thread{"conversationComponents"}:
let content = t{"conversationTweetComponent", "tweet"}
if content{"displayType"}.getStr == "SelfThread":
result.self = true
var tweet = finalizeTweet(global, content{"id"}.getStr)
if not tweet.available:
tweet.tombstone = getTombstone(content{"tombstone"})
result.thread.content.add tweet
proc parseConversation*(js: JsonNode; tweetId: string): Conversation =
result = Conversation(replies: Result[Chain](beginning: true))
let global = parseGlobalObjects(? js)
let instructions = ? js{"timeline", "instructions"}
if instructions.len == 0:
return
for e in instructions[0]{"addEntries", "entries"}:
let entry = e{"entryId"}.getStr
if "tweet" in entry or "tombstone" in entry:
let tweet = finalizeTweet(global, e.getEntryId)
if $tweet.id != tweetId:
result.before.content.add tweet
else:
result.tweet = tweet
elif "conversationThread" in entry:
let (thread, self) = parseThread(e, global)
if thread.content.len > 0:
if self:
result.after = thread
else:
result.replies.content.add thread
elif "cursor-showMore" in entry:
result.replies.bottom = e.getCursor
elif "cursor-bottom" in entry:
result.replies.bottom = e.getCursor
proc parseStatus*(js: JsonNode): Tweet =
with e, js{"errors"}:
if e.getError == tweetNotFound:
return
result = parseTweet(js)
if not result.isNil:
result.user = parseUser(js{"user"})
with quote, js{"quoted_status"}:
result.quote = some parseStatus(js{"quoted_status"})
proc parseInstructions[T](res: var Result[T]; global: GlobalObjects; js: JsonNode) =
if js.kind != JArray or js.len == 0:
return
@@ -375,7 +347,12 @@ proc parseTimeline*(js: JsonNode; after=""): Timeline =
result.parseInstructions(global, instructions)
for e in instructions[0]{"addEntries", "entries"}:
var entries: JsonNode
for i in instructions:
if "addEntries" in i:
entries = i{"addEntries", "entries"}
for e in ? entries:
let entry = e{"entryId"}.getStr
if "tweet" in entry or entry.startsWith("sq-I-t") or "tombstone" in entry:
let tweet = finalizeTweet(global, e.getEntryId)
@@ -385,7 +362,7 @@ proc parseTimeline*(js: JsonNode; after=""): Timeline =
result.top = e.getCursor
elif "cursor-bottom" in entry:
result.bottom = e.getCursor
elif entry.startsWith("sq-C"):
elif entry.startsWith("sq-cursor"):
with cursor, e{"content", "operation", "cursor"}:
if cursor{"cursorType"}.getStr == "Bottom":
result.bottom = cursor{"value"}.getStr
@@ -395,7 +372,7 @@ proc parseTimeline*(js: JsonNode; after=""): Timeline =
proc parsePhotoRail*(js: JsonNode): PhotoRail =
for tweet in js:
let
t = parseTweet(tweet)
t = parseTweet(tweet, js{"card"})
url = if t.photos.len > 0: t.photos[0]
elif t.video.isSome: get(t.video).thumb
elif t.gif.isSome: get(t.gif).thumb
@@ -404,3 +381,141 @@ proc parsePhotoRail*(js: JsonNode): PhotoRail =
if url.len == 0: continue
result.add GalleryPhoto(url: url, tweetId: $t.id)
proc parseGraphTweet(js: JsonNode): Tweet =
if js.kind == JNull:
return Tweet()
case js{"__typename"}.getStr
of "TweetUnavailable":
return Tweet()
of "TweetTombstone":
return Tweet(text: js{"tombstone", "text"}.getTombstone)
of "TweetPreviewDisplay":
return Tweet(text: "You're unable to view this Tweet because it's only available to the Subscribers of the account owner.")
of "TweetWithVisibilityResults":
return parseGraphTweet(js{"tweet"})
var jsCard = copy(js{"card", "legacy"})
if jsCard.kind != JNull:
var values = newJObject()
for val in jsCard["binding_values"]:
values[val["key"].getStr] = val["value"]
jsCard["binding_values"] = values
result = parseTweet(js{"legacy"}, jsCard)
result.user = parseGraphUser(js{"core"})
with noteTweet, js{"note_tweet", "note_tweet_results", "result"}:
result.expandNoteTweetEntities(noteTweet)
if result.quote.isSome:
result.quote = some(parseGraphTweet(js{"quoted_status_result", "result"}))
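The loop above reshapes the GraphQL card's binding_values array into an object keyed by field name so the legacy card parser can index it directly. A minimal, runnable sketch of that reshaping, assuming a made-up card payload (not taken from the diff):
import json
let raw = parseJson("""
  {"binding_values": [
    {"key": "title",  "value": {"string_value": "Example title"}},
    {"key": "domain", "value": {"string_value": "example.com"}}
  ]}""")
var jsCard = copy(raw)
var values = newJObject()
for val in jsCard["binding_values"]:
  values[val["key"].getStr] = val["value"]   # key/value pair -> named field
jsCard["binding_values"] = values
doAssert jsCard["binding_values"]["title"]["string_value"].getStr == "Example title"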
proc parseGraphThread(js: JsonNode): tuple[thread: Chain; self: bool] =
let thread = js{"content", "items"}
for t in js{"content", "items"}:
let entryId = t{"entryId"}.getStr
if "cursor-showmore" in entryId:
let cursor = t{"item", "itemContent", "value"}
result.thread.cursor = cursor.getStr
result.thread.hasMore = true
elif "tweet" in entryId:
let tweet = parseGraphTweet(t{"item", "itemContent", "tweet_results", "result"})
result.thread.content.add tweet
if t{"item", "itemContent", "tweetDisplayType"}.getStr == "SelfThread":
result.self = true
proc parseGraphTweetResult*(js: JsonNode): Tweet =
with tweet, js{"data", "tweetResult", "result"}:
result = parseGraphTweet(tweet)
proc parseGraphConversation*(js: JsonNode; tweetId: string): Conversation =
result = Conversation(replies: Result[Chain](beginning: true))
let instructions = ? js{"data", "threaded_conversation_with_injections", "instructions"}
if instructions.len == 0:
return
for e in instructions[0]{"entries"}:
let entryId = e{"entryId"}.getStr
# echo entryId
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
let tweet = parseGraphTweet(tweetResult)
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
if $tweet.id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("tombstone"):
let id = entryId.getId()
let tweet = Tweet(
id: parseBiggestInt(id),
available: false,
text: e{"content", "itemContent", "tombstoneInfo", "richText"}.getTombstone
)
if id == tweetId:
result.tweet = tweet
else:
result.before.content.add tweet
elif entryId.startsWith("conversationthread"):
let (thread, self) = parseGraphThread(e)
if self:
result.after = thread
else:
result.replies.content.add thread
elif entryId.startsWith("cursor-bottom"):
result.replies.bottom = e{"content", "itemContent", "value"}.getStr
proc parseGraphTimeline*(js: JsonNode; root: string; after=""): Timeline =
result = Timeline(beginning: after.len == 0)
let instructions =
if root == "list": ? js{"data", "list", "tweets_timeline", "timeline", "instructions"}
else: ? js{"data", "user", "result", "timeline_v2", "timeline", "instructions"}
if instructions.len == 0:
return
for i in instructions:
if i{"type"}.getStr == "TimelineAddEntries":
for e in i{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
let tweet = parseGraphTweet(tweetResult)
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
result.content.add tweet
elif entryId.startsWith("cursor-bottom"):
result.bottom = e{"content", "value"}.getStr
proc parseGraphSearch*(js: JsonNode; after=""): Timeline =
result = Timeline(beginning: after.len == 0)
let instructions = js{"data", "search_by_raw_query", "search_timeline", "timeline", "instructions"}
if instructions.len == 0:
return
for instruction in instructions:
let typ = instruction{"type"}.getStr
if typ == "TimelineAddEntries":
for e in instructions[0]{"entries"}:
let entryId = e{"entryId"}.getStr
if entryId.startsWith("tweet"):
with tweetResult, e{"content", "itemContent", "tweet_results", "result"}:
let tweet = parseGraphTweet(tweetResult)
if not tweet.available:
tweet.id = parseBiggestInt(entryId.getId())
result.content.add tweet
elif entryId.startsWith("cursor-bottom"):
result.bottom = e{"content", "value"}.getStr
elif typ == "TimelineReplaceEntry":
if instruction{"entry_id_to_replace"}.getStr.startsWith("cursor-bottom"):
result.bottom = instruction{"entry", "content", "value"}.getStr


@@ -28,13 +28,13 @@ template `?`*(js: JsonNode): untyped =
if j.isNull: return
j
template `with`*(ident, value, body): untyped =
block:
template with*(ident, value, body): untyped =
if true:
let ident {.inject.} = value
if ident != nil: body
template `with`*(ident; value: JsonNode; body): untyped =
block:
template with*(ident; value: JsonNode; body): untyped =
if true:
let ident {.inject.} = value
if value.notNull: body
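As an illustrative usage sketch (not part of the diff): the JsonNode overload injects the bound name and only runs the body when the looked-up node is non-null, which is how it is used throughout the parser changes above. The js node below is assumed to be a parsed tweet object.
with media, js{"extended_entities", "media"}:
  # the body is skipped entirely when the path is missing or null
  for m in media:
    echo m{"type"}.getStr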
@@ -130,9 +130,24 @@ proc getBanner*(js: JsonNode): string =
return
proc getTombstone*(js: JsonNode): string =
result = js{"tombstoneInfo", "richText", "text"}.getStr
result = js{"text"}.getStr
result.removeSuffix(" Learn more")
proc getMp4Resolution*(url: string): int =
# parses the height out of a URL like this one:
# https://video.twimg.com/ext_tw_video/<tweet-id>/pu/vid/720x1280/<random>.mp4
const vidSep = "/vid/"
let
vidIdx = url.find(vidSep) + vidSep.len
resIdx = url.find('x', vidIdx) + 1
res = url[resIdx ..< url.find("/", resIdx)]
try:
return parseInt(res)
except ValueError:
# cannot determine resolution (e.g. m3u8/non-mp4 video)
return 0
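An illustrative check of the helper above (the URLs are placeholders, not from the diff):
doAssert getMp4Resolution(
  "https://video.twimg.com/ext_tw_video/1/pu/vid/720x1280/abc.mp4") == 1280
# playlist URLs have no "<width>x<height>" path segment, so parseInt fails
# and the proc falls back to 0
doAssert getMp4Resolution(
  "https://video.twimg.com/ext_tw_video/1/pu/pl/abc.m3u8") == 0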
proc extractSlice(js: JsonNode): Slice[int] =
result = js["indices"][0].getInt ..< js["indices"][1].getInt
@@ -215,47 +230,37 @@ proc expandUserEntities*(user: var User; js: JsonNode) =
user.bio = user.bio.replacef(unRegex, unReplace)
.replacef(htRegex, htReplace)
proc expandTweetEntities*(tweet: Tweet; js: JsonNode) =
let
orig = tweet.text.toRunes
textRange = js{"display_text_range"}
textSlice = textRange{0}.getInt .. textRange{1}.getInt
hasQuote = js{"is_quote_status"}.getBool
hasCard = tweet.card.isSome
var replyTo = ""
if tweet.replyId != 0:
with reply, js{"in_reply_to_screen_name"}:
tweet.reply.add reply.getStr
replyTo = reply.getStr
let ent = ? js{"entities"}
proc expandTextEntities(tweet: Tweet; entities: JsonNode; text: string; textSlice: Slice[int];
replyTo=""; hasQuote=false) =
let hasCard = tweet.card.isSome
var replacements = newSeq[ReplaceSlice]()
with urls, ent{"urls"}:
with urls, entities{"urls"}:
for u in urls:
let urlStr = u["url"].getStr
if urlStr.len == 0 or urlStr notin tweet.text:
if urlStr.len == 0 or urlStr notin text:
continue
replacements.extractUrls(u, textSlice.b, hideTwitter = hasQuote)
if hasCard and u{"url"}.getStr == get(tweet.card).url:
get(tweet.card).url = u{"expanded_url"}.getStr
with media, ent{"media"}:
with media, entities{"media"}:
for m in media:
replacements.extractUrls(m, textSlice.b, hideTwitter = true)
if "hashtags" in ent:
for hashtag in ent["hashtags"]:
if "hashtags" in entities:
for hashtag in entities["hashtags"]:
replacements.extractHashtags(hashtag)
if "symbols" in ent:
for symbol in ent["symbols"]:
if "symbols" in entities:
for symbol in entities["symbols"]:
replacements.extractHashtags(symbol)
if "user_mentions" in ent:
for mention in ent["user_mentions"]:
if "user_mentions" in entities:
for mention in entities["user_mentions"]:
let
name = mention{"screen_name"}.getStr
slice = mention.extractSlice
@@ -272,5 +277,27 @@ proc expandTweetEntities*(tweet: Tweet; js: JsonNode) =
replacements.deduplicate
replacements.sort(cmp)
tweet.text = orig.replacedWith(replacements, textSlice)
.strip(leading=false)
tweet.text = text.toRunes.replacedWith(replacements, textSlice).strip(leading=false)
proc expandTweetEntities*(tweet: Tweet; js: JsonNode) =
let
entities = ? js{"entities"}
hasQuote = js{"is_quote_status"}.getBool
textRange = js{"display_text_range"}
textSlice = textRange{0}.getInt .. textRange{1}.getInt
var replyTo = ""
if tweet.replyId != 0:
with reply, js{"in_reply_to_screen_name"}:
replyTo = reply.getStr
tweet.reply.add replyTo
tweet.expandTextEntities(entities, tweet.text, textSlice, replyTo, hasQuote)
proc expandNoteTweetEntities*(tweet: Tweet; js: JsonNode) =
let
entities = ? js{"entity_set"}
text = js{"text"}.getStr
textSlice = 0..text.runeLen
tweet.expandTextEntities(entities, text, textSlice)


@@ -80,10 +80,10 @@ genPrefs:
Media:
mp4Playback(checkbox, true):
"Enable mp4 video playback (only for gifs)"
"Enable mp4 video playback"
hlsPlayback(checkbox, false):
"Enable hls video streaming (requires JavaScript)"
"Enable HLS video streaming (requires JavaScript)"
proxyVideos(checkbox, true):
"Proxy video streaming through the server (might be slow)"
@@ -107,10 +107,6 @@ genPrefs:
"Reddit -> Teddit/Libreddit"
placeholder: "Teddit hostname"
replaceInstagram(input, ""):
"Instagram -> Bibliogram"
placeholder: "Bibliogram hostname"
iterator allPrefs*(): Pref =
for k, v in prefList:
for pref in v:


@@ -93,11 +93,11 @@ proc genQueryUrl*(query: Query): string =
if query.text.len > 0:
params.add "q=" & encodeUrl(query.text)
for f in query.filters:
params.add "f-" & f & "=on"
params.add &"f-{f}=on"
for e in query.excludes:
params.add "e-" & e & "=on"
params.add &"e-{e}=on"
for i in query.includes.filterIt(it != "nativeretweets"):
params.add "i-" & i & "=on"
params.add &"i-{i}=on"
if query.since.len > 0:
params.add "since=" & query.since


@@ -118,11 +118,11 @@ proc getUserId*(username: string): Future[string] {.async.} =
pool.withAcquire(r):
result = await r.hGet(name.uidKey, name)
if result == redisNil:
let user = await getUser(username)
let user = await getGraphUser(username)
if user.suspended:
return "suspended"
else:
await cacheUserId(name, user.id)
await all(cacheUserId(name, user.id), cache(user))
return user.id
proc getCachedUser*(username: string; fetch=true): Future[User] {.async.} =
@@ -130,8 +130,7 @@ proc getCachedUser*(username: string; fetch=true): Future[User] {.async.} =
if prof != redisNil:
prof.deserialize(User)
elif fetch:
let userId = await getUserId(username)
result = await getGraphUser(userId)
result = await getGraphUser(username)
await cache(result)
proc getCachedUsername*(userId: string): Future[string] {.async.} =
@@ -142,9 +141,11 @@ proc getCachedUsername*(userId: string): Future[string] {.async.} =
if username != redisNil:
result = username
else:
let user = await getUserById(userId)
let user = await getGraphUserById(userId)
result = user.username
await setEx(key, baseCacheTime, result)
if result.len > 0 and user.id.len > 0:
await all(cacheUserId(result, user.id), cache(user))
proc getCachedTweet*(id: int64): Future[Tweet] {.async.} =
if id == 0: return
@@ -152,8 +153,8 @@ proc getCachedTweet*(id: int64): Future[Tweet] {.async.} =
if tweet != redisNil:
tweet.deserialize(Tweet)
else:
result = await getStatus($id)
if result.isNil:
result = await getGraphTweetResult($id)
if not result.isNil:
await cache(result)
proc getCachedPhotoRail*(name: string): Future[PhotoRail] {.async.} =


@@ -1,5 +1,5 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, options
import asyncdispatch, strutils, strformat, options
import jester, karax/vdom
import ".."/[types, api]
import ../views/[embed, tweet, general]
@@ -25,12 +25,12 @@ proc createEmbedRouter*(cfg: Config) =
if convo == nil or convo.tweet == nil:
resp Http404
resp $renderTweetEmbed(convo.tweet, path, prefs, cfg, request)
resp renderTweetEmbed(convo.tweet, path, prefs, cfg, request)
get "/embed/Tweet.html":
let id = @"id"
if id.len > 0:
redirect("/i/status/" & id & "/embed")
redirect(&"/i/status/{id}/embed")
else:
resp Http404


@@ -1,23 +1,25 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, uri
import strutils, strformat, uri
import jester
import router_utils
import ".."/[types, redis_cache, api]
import ../views/[general, timeline, list]
export getListTimeline, getGraphList
template respList*(list, timeline, title, vnode: typed) =
if list.id.len == 0 or list.name.len == 0:
resp Http404, showError("List " & @"id" & " not found", cfg)
resp Http404, showError(&"""List "{@"id"}" not found""", cfg)
let
html = renderList(vnode, timeline.query, list)
rss = "/i/lists/$1/rss" % [@"id"]
rss = &"""/i/lists/{@"id"}/rss"""
resp renderMain(html, request, cfg, prefs, titleText=title, rss=rss, banner=list.banner)
proc title*(list: List): string =
&"@{list.username}/{list.name}"
proc createListRouter*(cfg: Config) =
router list:
get "/@name/lists/@slug/?":
@@ -28,24 +30,22 @@ proc createListRouter*(cfg: Config) =
slug = decodeUrl(@"slug")
list = await getCachedList(@"name", slug)
if list.id.len == 0:
resp Http404, showError("List \"" & @"slug" & "\" not found", cfg)
redirect("/i/lists/" & list.id)
resp Http404, showError(&"""List "{@"slug"}" not found""", cfg)
redirect(&"/i/lists/{list.id}")
get "/i/lists/@id/?":
cond '.' notin @"id"
let
prefs = cookiePrefs()
list = await getCachedList(id=(@"id"))
title = "@" & list.username & "/" & list.name
timeline = await getListTimeline(list.id, getCursor())
timeline = await getGraphListTweets(list.id, getCursor())
vnode = renderTimelineTweets(timeline, prefs, request.path)
respList(list, timeline, title, vnode)
respList(list, timeline, list.title, vnode)
get "/i/lists/@id/members":
cond '.' notin @"id"
let
prefs = cookiePrefs()
list = await getCachedList(id=(@"id"))
title = "@" & list.username & "/" & list.name
members = await getGraphListMembers(list, getCursor())
respList(list, members, title, renderTimelineUsers(members, prefs, request.path))
respList(list, members, list.title, renderTimelineUsers(members, prefs, request.path))


@@ -12,7 +12,8 @@ export httpclient, os, strutils, asyncstreams, base64, re
const
m3u8Mime* = "application/vnd.apple.mpegurl"
maxAge* = "max-age=604800"
mp4Mime* = "video/mp4"
maxAge* = "public, max-age=604800, must-revalidate"
proc safeFetch*(url: string): Future[string] {.async.} =
let client = newAsyncHttpClient()
@@ -20,56 +21,84 @@ proc safeFetch*(url: string): Future[string] {.async.} =
except: discard
finally: client.close()
template respond*(req: asynchttpserver.Request; headers) =
var msg = "HTTP/1.1 200 OK\c\L"
for k, v in headers:
template respond*(req: asynchttpserver.Request; code: HttpCode;
headers: seq[(string, string)]) =
var msg = "HTTP/1.1 " & $code & "\c\L"
for (k, v) in headers:
msg.add(k & ": " & v & "\c\L")
msg.add "\c\L"
yield req.client.send(msg)
yield req.client.send(msg, flags={})
proc getContentLength(res: AsyncResponse): string =
result = "0"
if res.headers.hasKey("content-length"):
result = $res.contentLength
elif res.headers.hasKey("content-range"):
result = res.headers["content-range"]
result = result[result.find('/') + 1 .. ^1]
if result == "*":
result.setLen(0)
proc proxyMedia*(req: jester.Request; url: string): Future[HttpCode] {.async.} =
result = Http200
let
request = req.getNativeReq()
client = newAsyncHttpClient()
hashed = $hash(url)
if request.headers.getOrDefault("If-None-Match") == hashed:
return Http304
let c = newAsyncHttpClient(headers=newHttpHeaders({
"accept": "*/*",
"range": $req.headers.getOrDefault("range")
}))
try:
let res = await client.get(url)
if res.status != "200 OK":
var res = await c.get(url)
if not res.status.startsWith("20"):
return Http404
let hashed = $hash(url)
if request.headers.getOrDefault("If-None-Match") == hashed:
return Http304
var headers = @{
"accept-ranges": "bytes",
"content-type": $res.headers.getOrDefault("content-type"),
"cache-control": maxAge,
"age": $res.headers.getOrDefault("age"),
"date": $res.headers.getOrDefault("date"),
"last-modified": $res.headers.getOrDefault("last-modified")
}
let contentLength =
if res.headers.hasKey("content-length"):
res.headers["content-length", 0]
else:
""
var tries = 0
while tries <= 10 and res.headers.hasKey("transfer-encoding"):
await sleepAsync(100 + tries * 200)
res = await c.get(url)
tries.inc
let headers = newHttpHeaders({
"Content-Type": res.headers["content-type", 0],
"Content-Length": contentLength,
"Cache-Control": maxAge,
"ETag": hashed
})
let contentLength = res.getContentLength
if contentLength.len > 0:
headers.add ("content-length", contentLength)
respond(request, headers)
if res.headers.hasKey("content-range"):
headers.add ("content-range", $res.headers.getOrDefault("content-range"))
respond(request, Http206, headers)
else:
respond(request, Http200, headers)
var (hasValue, data) = (true, "")
while hasValue:
(hasValue, data) = await res.bodyStream.read()
if hasValue:
await request.client.send(data)
await request.client.send(data, flags={})
data.setLen 0
except HttpRequestError, ProtocolError, OSError:
except OSError: discard
except ProtocolError, HttpRequestError:
result = Http404
finally:
client.close()
c.close()
template check*(code): untyped =
template check*(c): untyped =
let code = c
if code != Http200:
resp code
else:
@@ -83,23 +112,27 @@ proc decoded*(req: jester.Request; index: int): string =
if based: decode(encoded)
else: decodeUrl(encoded)
proc getPicUrl*(req: jester.Request): string =
result = decoded(req, 1)
if "twimg.com" notin result:
result.insert(twimg)
if not result.startsWith(https):
result.insert(https)
proc createMediaRouter*(cfg: Config) =
router media:
get "/pic/?":
resp Http404
get re"^\/pic\/orig\/(enc)?\/?(.+)":
let url = getPicUrl(request)
cond isTwitterUrl(parseUri(url)) == true
check await proxyMedia(request, url & "?name=orig")
get re"^\/pic\/(enc)?\/?(.+)":
var url = decoded(request, 1)
if "twimg.com" notin url:
url.insert(twimg)
if not url.startsWith(https):
url.insert(https)
let uri = parseUri(url)
cond isTwitterUrl(uri) == true
let code = await proxyMedia(request, url)
check code
let url = getPicUrl(request)
cond isTwitterUrl(parseUri(url)) == true
check await proxyMedia(request, url)
get re"^\/video\/(enc)?\/?(.+)\/(.+)$":
let url = decoded(request, 2)
@@ -109,8 +142,7 @@ proc createMediaRouter*(cfg: Config) =
resp showError("Failed to verify signature", cfg)
if ".mp4" in url or ".ts" in url or ".m4s" in url:
let code = await proxyMedia(request, url)
check code
check await proxyMedia(request, url)
var content: string
if ".vmap" in url:


@@ -1,5 +1,5 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, tables, times, hashes, uri
import asyncdispatch, tables, times, hashes, uri
import jester
@@ -10,6 +10,11 @@ include "../views/rss.nimf"
export times, hashes
proc redisKey*(page, name, cursor: string): string =
result = page & ":" & name
if cursor.len > 0:
result &= ":" & cursor
proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.} =
var profile: Profile
let
@@ -23,7 +28,7 @@ proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.
var q = query
q.fromUser = names
profile = Profile(
tweets: await getSearch[Tweet](q, after),
tweets: await getGraphSearch(q, after),
# this is kinda dumb
user: User(
username: name,
@@ -42,8 +47,8 @@ proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.
template respRss*(rss, page) =
if rss.cursor.len == 0:
let info = case page
of "User": " \"$1\" " % @"name"
of "List": " $1 " % @"id"
of "User": " \"" & @"name" & "\" "
of "List": " \"" & @"id" & "\" "
else: " "
resp Http404, showError(page & info & "not found", cfg)
@@ -67,13 +72,13 @@ proc createRssRouter*(cfg: Config) =
let
cursor = getCursor()
key = "search:" & $hash(genQueryUrl(query)) & ":" & cursor
key = redisKey("search", $hash(genQueryUrl(query)), cursor)
var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss, "Search")
let tweets = await getSearch[Tweet](query, cursor)
let tweets = await getGraphSearch(query, cursor)
rss.cursor = tweets.bottom
rss.feed = renderSearchRss(tweets.content, query.text, genQueryUrl(query), cfg)
@@ -84,9 +89,8 @@ proc createRssRouter*(cfg: Config) =
cond cfg.enableRss
cond '.' notin @"name"
let
cursor = getCursor()
name = @"name"
key = "twitter:" & name & ":" & cursor
key = redisKey("twitter", name, getCursor())
var rss = await getCachedRss(key)
if rss.cursor.len > 0:
@@ -101,18 +105,20 @@ proc createRssRouter*(cfg: Config) =
cond cfg.enableRss
cond '.' notin @"name"
cond @"tab" in ["with_replies", "media", "search"]
let name = @"name"
let query =
case @"tab"
of "with_replies": getReplyQuery(name)
of "media": getMediaQuery(name)
of "search": initQuery(params(request), name=name)
else: Query(fromUser: @[name])
let
name = @"name"
tab = @"tab"
query =
case tab
of "with_replies": getReplyQuery(name)
of "media": getMediaQuery(name)
of "search": initQuery(params(request), name=name)
else: Query(fromUser: @[name])
var key = @"tab" & ":" & @"name" & ":"
if @"tab" == "search":
key &= $hash(genQueryUrl(query)) & ":"
key &= getCursor()
let searchKey = if tab != "search": ""
else: ":" & $hash(genQueryUrl(query))
let key = redisKey(tab, name & searchKey, getCursor())
var rss = await getCachedRss(key)
if rss.cursor.len > 0:
@@ -143,18 +149,17 @@ proc createRssRouter*(cfg: Config) =
get "/i/lists/@id/rss":
cond cfg.enableRss
let
id = @"id"
cursor = getCursor()
key =
if cursor.len == 0: "lists:" & @"id"
else: "lists:" & @"id" & ":" & cursor
key = redisKey("lists", id, cursor)
var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss, "List")
let
list = await getCachedList(id=(@"id"))
timeline = await getListTimeline(list.id, cursor)
list = await getCachedList(id=id)
timeline = await getGraphListTweets(list.id, cursor)
rss.cursor = timeline.bottom
rss.feed = renderListRss(timeline.content, list, cfg)


@@ -14,25 +14,31 @@ export search
proc createSearchRouter*(cfg: Config) =
router search:
get "/search/?":
if @"q".len > 500:
let q = @"q"
if q.len > 500:
resp Http400, showError("Search input too long.", cfg)
let
prefs = cookiePrefs()
query = initQuery(params(request))
title = "Search" & (if q.len > 0: " (" & q & ")" else: "")
case query.kind
of users:
if "," in @"q":
redirect("/" & @"q")
let users = await getSearch[User](query, getCursor())
resp renderMain(renderUserSearch(users, prefs), request, cfg, prefs)
if "," in q:
redirect("/" & q)
var users: Result[User]
try:
users = await getUserSearch(query, getCursor())
except InternalError:
users = Result[User](beginning: true, query: query)
resp renderMain(renderUserSearch(users, prefs), request, cfg, prefs, title)
of tweets:
let
tweets = await getSearch[Tweet](query, getCursor())
tweets = await getGraphSearch(query, getCursor())
rss = "/search/rss?" & genQueryUrl(query)
resp renderMain(renderTweetSearch(tweets, prefs, getPath()),
request, cfg, prefs, rss=rss)
request, cfg, prefs, title, rss=rss)
else:
resp Http404, showError("Invalid search", cfg)
@@ -42,4 +48,4 @@ proc createSearchRouter*(cfg: Config) =
get "/opensearch":
let url = getUrlPrefix(cfg) & "/search?q="
resp Http200, {"Content-Type": "application/opensearchdescription+xml"},
generateOpenSearchXML(cfg.title, cfg.hostname, url)
generateOpenSearchXML(cfg.title, cfg.hostname, url)


@@ -16,17 +16,21 @@ proc createStatusRouter*(cfg: Config) =
router status:
get "/@name/status/@id/?":
cond '.' notin @"name"
cond not @"id".any(c => not c.isDigit)
let id = @"id"
if id.len > 19 or id.any(c => not c.isDigit):
resp Http404, showError("Invalid tweet ID", cfg)
let prefs = cookiePrefs()
# used for the infinite scroll feature
if @"scroll".len > 0:
let replies = await getReplies(@"id", getCursor())
let replies = await getReplies(id, getCursor())
if replies.content.len == 0:
resp Http404, ""
resp $renderReplies(replies, prefs, getPath())
let conv = await getTweet(@"id", getCursor())
let conv = await getTweet(id, getCursor())
if conv == nil:
echo "nil conv"
@@ -72,3 +76,6 @@ proc createStatusRouter*(cfg: Config) =
get "/i/web/status/@id":
redirect("/i/status/" & @"id")
get "/@name/thread/@id/?":
redirect("/$1/status/$2" % [@"name", @"id"])


@@ -47,10 +47,10 @@ proc fetchProfile*(after: string; query: Query; skipRail=false;
let
timeline =
case query.kind
of posts: getTimeline(userId, after)
of replies: getTimeline(userId, after, replies=true)
of media: getMediaTimeline(userId, after)
else: getSearch[Tweet](query, after)
of posts: getGraphUserTweets(userId, TimelineKind.tweets, after)
of replies: getGraphUserTweets(userId, TimelineKind.replies, after)
of media: getGraphUserTweets(userId, TimelineKind.media, after)
else: getGraphSearch(query, after)
rail =
skipIf(skipRail or query.kind == media, @[]):
@@ -64,6 +64,7 @@ proc fetchProfile*(after: string; query: Query; skipRail=false;
let tweet = await getCachedTweet(user.pinnedTweet)
if not tweet.isNil:
tweet.pinned = true
tweet.user = user
pinned = some tweet
result = Profile(
@@ -82,7 +83,7 @@ proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;
rss, after: string): Future[string] {.async.} =
if query.fromUser.len != 1:
let
timeline = await getSearch[Tweet](query, after)
timeline = await getGraphSearch(query, after)
html = renderTweetSearch(timeline, prefs, getPath())
return renderMain(html, request, cfg, prefs, "Multi", rss=rss)
@@ -123,7 +124,7 @@ proc createTimelineRouter*(cfg: Config) =
get "/@name/?@tab?/?":
cond '.' notin @"name"
cond @"name" notin ["pic", "gif", "video"]
cond @"name" notin ["pic", "gif", "video", "search", "settings", "login", "intent", "i"]
cond @"tab" in ["with_replies", "media", "search", ""]
let
prefs = cookiePrefs()
@@ -137,7 +138,7 @@ proc createTimelineRouter*(cfg: Config) =
# used for the infinite scroll feature
if @"scroll".len > 0:
if query.fromUser.len != 1:
var timeline = await getSearch[Tweet](query, after)
var timeline = await getGraphSearch(query, after)
if timeline.content.len == 0: resp Http404
timeline.beginning = true
resp $renderTweetSearch(timeline, prefs, getPath())


@@ -73,9 +73,9 @@
}
}
.profile-joindate, .profile-location, profile-website {
.profile-joindate, .profile-location, .profile-website {
color: var(--fg_faded);
margin: 2px 0;
margin: 1px 0;
width: 100%;
}
}


@@ -98,10 +98,9 @@
}
.avatar {
position: absolute;
&.round {
border-radius: 50%;
-webkit-user-select: none;
}
&.mini {
@@ -121,14 +120,22 @@
background-color: var(--bg_panel);
.tweet-content {
font-size: 18px
font-size: 18px;
}
.tweet-body {
.tweet-body {
display: flex;
flex-direction: column;
max-height: calc(100vh - 0.75em * 2);
}
.card-image img {
height: auto;
}
.avatar {
position: absolute;
}
}
.attribution {
@@ -193,6 +200,7 @@
.tweet-stats {
margin-bottom: -3px;
-webkit-user-select: none;
}
.tweet-stat {
@@ -224,6 +232,7 @@
left: 0;
top: 0;
position: absolute;
-webkit-user-select: none;
&:hover {
background-color: var(--bg_hover);


@@ -23,7 +23,6 @@
font-size: 18px;
}
@media(max-width: 600px) {
.main-tweet .tweet-content {
font-size: 16px;


@@ -3,7 +3,7 @@
video {
max-height: 100%;
max-width: 100%;
width: 100%;
}
.gallery-video {


@@ -1,8 +1,7 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, httpclient, times, sequtils, json, random
import strutils, tables
import zippy
import types, consts, http_pool
import types, consts
const
maxConcurrentReqs = 5 # max requests at a time per token, to avoid race conditions
@@ -11,9 +10,14 @@ const
failDelay = initDuration(minutes=30)
var
clientPool: HttpPool
tokenPool: seq[Token]
lastFailed: Time
enableLogging = false
let headers = newHttpHeaders({"authorization": auth})
template log(str) =
if enableLogging: echo "[tokens] ", str
proc getPoolJson*(): JsonNode =
var
@@ -37,9 +41,12 @@ proc getPoolJson*(): JsonNode =
let
maxReqs =
case api
of Api.listMembers, Api.listBySlug, Api.list, Api.userRestId: 500
of Api.timeline: 187
else: 180
of Api.listMembers, Api.listBySlug, Api.list, Api.listTweets,
Api.userTweets, Api.userTweetsAndReplies, Api.userMedia,
Api.userRestId, Api.userScreenName,
Api.tweetDetail, Api.tweetResult, Api.search: 500
of Api.userSearch: 900
reqs = maxReqs - token.apis[api].remaining
reqsPerApi[$api] = reqsPerApi.getOrDefault($api, 0) + reqs
@@ -60,25 +67,23 @@ proc fetchToken(): Future[Token] {.async.} =
if getTime() - lastFailed < failDelay:
raise rateLimitError()
let headers = newHttpHeaders({
"accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
"accept-encoding": "gzip",
"accept-language": "en-US,en;q=0.5",
"connection": "keep-alive",
"authorization": auth
})
let client = newAsyncHttpClient(headers=headers)
try:
let
resp = clientPool.use(headers): await c.postContent(activate)
tokNode = parseJson(uncompress(resp))["guest_token"]
resp = await client.postContent(activate)
tokNode = parseJson(resp)["guest_token"]
tok = tokNode.getStr($(tokNode.getInt))
time = getTime()
return Token(tok: tok, init: time, lastUse: time)
except Exception as e:
lastFailed = getTime()
echo "fetching token failed: ", e.msg
echo "[tokens] fetching token failed: ", e.msg
if "Try again" notin e.msg:
echo "[tokens] fetching tokens paused, resuming in 30 minutes"
lastFailed = getTime()
finally:
client.close()
proc expired(token: Token): bool =
let time = getTime()
@@ -100,6 +105,9 @@ proc isReady(token: Token; api: Api): bool =
proc release*(token: Token; used=false; invalid=false) =
if token.isNil: return
if invalid or token.expired:
if invalid: log "discarding invalid token"
elif token.expired: log "discarding expired token"
let idx = tokenPool.find(token)
if idx > -1: tokenPool.delete(idx)
elif used:
@@ -115,6 +123,7 @@ proc getToken*(api: Api): Future[Token] {.async.} =
if not result.isReady(api):
release(result)
result = await fetchToken()
log "added new token to pool"
tokenPool.add result
if not result.isNil:
@@ -143,10 +152,11 @@ proc poolTokens*(amount: int) {.async.} =
except: discard
if not newToken.isNil:
log "added new token to pool"
tokenPool.add newToken
proc initTokenPool*(cfg: Config) {.async.} =
clientPool = HttpPool()
enableLogging = cfg.enableDebug
while true:
if tokenPool.countIt(not it.isLimited(Api.timeline)) < cfg.minTokens:


@@ -7,17 +7,28 @@ genPrefsType()
type
RateLimitError* = object of CatchableError
InternalError* = object of CatchableError
BadClientError* = object of CatchableError
TimelineKind* {.pure.} = enum
tweets
replies
media
Api* {.pure.} = enum
userShow
tweetDetail
tweetResult
timeline
search
tweet
userSearch
list
listBySlug
listMembers
listTweets
userRestId
status
userScreenName
userTweets
userTweetsAndReplies
userMedia
RateLimit* = object
remaining*: int
@@ -34,17 +45,22 @@ type
null = 0
noUserMatches = 17
protectedUser = 22
missingParams = 25
couldntAuth = 32
doesntExist = 34
invalidParam = 47
userNotFound = 50
suspended = 63
rateLimited = 88
invalidToken = 89
listIdOrSlug = 112
tweetNotFound = 144
tweetNotAuthorized = 179
forbidden = 200
badToken = 239
noCsrf = 353
tweetUnavailable = 421
tweetCensored = 422
User* = object
id*: string
@@ -75,6 +91,7 @@ type
contentType*: VideoType
url*: string
bitrate*: int
resolution*: int
Video* = object
durationMs*: int
@@ -144,6 +161,7 @@ type
imageDirectMessage = "image_direct_message"
audiospace = "audiospace"
newsletterPublication = "newsletter_publication"
hidden
unknown
Card* = object
@@ -174,6 +192,8 @@ type
available*: bool
tombstone*: string
location*: string
# Unused, needed for backwards compat
source*: string
stats*: TweetStats
retweet*: Option[Tweet]
attribution*: Option[User]


@@ -1,6 +1,7 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, strformat, uri, tables, base64
import nimcrypto
import types
var
hmacKey: string
@@ -28,6 +29,20 @@ proc setProxyEncoding*(state: bool) =
proc getHmac*(data: string): string =
($hmac(sha256, hmacKey, data))[0 .. 12]
proc getBestMp4VidVariant(video: Video): VideoVariant =
for v in video.variants:
if v.bitrate >= result.bitrate:
result = v
proc getVidVariant*(video: Video; playbackType: VideoType): VideoVariant =
case playbackType
of mp4:
return video.getBestMp4VidVariant
of m3u8, vmap:
for variant in video.variants:
if variant.contentType == playbackType:
return variant
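A rough sketch of the variant selection above (bitrates and URLs are made up; the Video/VideoVariant field names follow their use elsewhere in this diff): mp4 playback picks the highest-bitrate variant, while HLS playback returns the variant matching the requested content type.
let video = Video(variants: @[
  VideoVariant(contentType: mp4, bitrate: 320_000, url: "low.mp4"),
  VideoVariant(contentType: mp4, bitrate: 832_000, url: "high.mp4"),
  VideoVariant(contentType: m3u8, bitrate: 0, url: "playlist.m3u8")
])
doAssert video.getVidVariant(mp4).url == "high.mp4"
doAssert video.getVidVariant(m3u8).url == "playlist.m3u8"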
proc getVidUrl*(link: string): string =
if link.len == 0: return
let sig = getHmac(link)
@@ -42,6 +57,12 @@ proc getPicUrl*(link: string): string =
else:
&"/pic/{encodeUrl(link)}"
proc getOrigPicUrl*(link: string): string =
if base64Media:
&"/pic/orig/enc/{encode(link, safe=true)}"
else:
&"/pic/orig/{encodeUrl(link)}"
proc filterParams*(params: Table): seq[(string, string)] =
for p in params.pairs():
if p[1].len > 0 and p[0] notin nitterParams:


@@ -15,7 +15,8 @@ proc renderVideoEmbed*(tweet: Tweet; cfg: Config; req: Request): string =
let node = buildHtml(html(lang="en")):
renderHead(prefs, cfg, req, video=vidUrl, images=(@[thumb]))
tdiv(class="embed-video"):
renderVideo(get(tweet.video), prefs, "")
body:
tdiv(class="embed-video"):
renderVideo(get(tweet.video), prefs, "")
result = doctype & $node


@@ -52,7 +52,7 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
let opensearchUrl = getUrlPrefix(cfg) & "/opensearch"
buildHtml(head):
link(rel="stylesheet", type="text/css", href="/css/style.css?v=16")
link(rel="stylesheet", type="text/css", href="/css/style.css?v=18")
link(rel="stylesheet", type="text/css", href="/css/fontello.css?v=2")
if theme.len > 0:
@@ -93,14 +93,13 @@ proc renderHead*(prefs: Prefs; cfg: Config; req: Request; titleText=""; desc="";
meta(property="og:site_name", content="Nitter")
meta(property="og:locale", content="en_US")
if banner.len > 0:
if banner.len > 0 and not banner.startsWith('#'):
let bannerUrl = getPicUrl(banner)
link(rel="preload", type="image/png", href=bannerUrl, `as`="image")
for url in images:
let suffix = if "400x400" in url or url.endsWith("placeholder.png"): ""
else: "?name=small"
let preloadUrl = getPicUrl(url & suffix)
let preloadUrl = if "400x400" in url: getPicUrl(url)
else: getSmallPic(url)
link(rel="preload", type="image/png", href=preloadUrl, `as`="image")
let image = getUrlPrefix(cfg) & getPicUrl(url)


@@ -50,7 +50,7 @@ proc renderUserCard*(user: User; prefs: Prefs): VNode =
span:
let url = replaceUrls(user.website, prefs)
icon "link"
a(href=url): text shortLink(url)
a(href=url): text url.shortLink
tdiv(class="profile-joindate"):
span(title=getJoinDateFull(user)):
@@ -78,8 +78,11 @@ proc renderPhotoRail(profile: Profile): VNode =
tdiv(class="photo-rail-grid"):
for i, photo in profile.photoRail:
if i == 16: break
let photoSuffix =
if "format" in photo.url or "placeholder" in photo.url: ""
else: ":thumb"
a(href=(&"/{profile.user.username}/status/{photo.tweetId}#m")):
genImg(photo.url & (if "format" in photo.url: "" else: ":thumb"))
genImg(photo.url & photoSuffix)
proc renderBanner(banner: string): VNode =
buildHtml():
@@ -105,7 +108,7 @@ proc renderProfile*(profile: var Profile; prefs: Prefs; path: string): VNode =
renderBanner(profile.user.banner)
let sticky = if prefs.stickyProfile: " sticky" else: ""
tdiv(class=(&"profile-tab{sticky}")):
tdiv(class=("profile-tab" & sticky)):
renderUserCard(profile.user, prefs)
if profile.photoRail.len > 0:
renderPhotoRail(profile)


@@ -1,11 +1,19 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils
import strutils, strformat
import karax/[karaxdsl, vdom, vstyles]
import ".."/[types, utils]
const smallWebp* = "?name=small&format=webp"
proc getSmallPic*(url: string): string =
result = url
if "?" notin url and not url.endsWith("placeholder.png"):
result &= smallWebp
result = getPicUrl(result)
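Illustrative behaviour of the getSmallPic helper above (the media URL is a placeholder): URLs without query parameters get the small-webp suffix before being routed through /pic, while already-parameterised URLs pass through as-is.
doAssert getSmallPic("https://pbs.twimg.com/media/abc.jpg") ==
  getPicUrl("https://pbs.twimg.com/media/abc.jpg" & smallWebp)
doAssert getSmallPic("https://pbs.twimg.com/media/abc.jpg?name=orig") ==
  getPicUrl("https://pbs.twimg.com/media/abc.jpg?name=orig")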
proc icon*(icon: string; text=""; title=""; class=""; href=""): VNode =
var c = "icon-" & icon
if class.len > 0: c = c & " " & class
if class.len > 0: c = &"{c} {class}"
buildHtml(tdiv(class="icon-container")):
if href.len > 0:
a(class=c, title=title, href=href)
@@ -51,29 +59,23 @@ proc buttonReferer*(action, text, path: string; class=""; `method`="post"): VNod
proc genCheckbox*(pref, label: string; state: bool): VNode =
buildHtml(label(class="pref-group checkbox-container")):
text label
if state: input(name=pref, `type`="checkbox", checked="")
else: input(name=pref, `type`="checkbox")
input(name=pref, `type`="checkbox", checked=state)
span(class="checkbox")
proc genInput*(pref, label, state, placeholder: string; class=""): VNode =
proc genInput*(pref, label, state, placeholder: string; class=""; autofocus=true): VNode =
let p = placeholder
buildHtml(tdiv(class=("pref-group pref-input " & class))):
if label.len > 0:
label(`for`=pref): text label
if state.len == 0:
input(name=pref, `type`="text", placeholder=p, value=state, autofocus="")
else:
input(name=pref, `type`="text", placeholder=p, value=state)
input(name=pref, `type`="text", placeholder=p, value=state, autofocus=(autofocus and state.len == 0))
proc genSelect*(pref, label, state: string; options: seq[string]): VNode =
buildHtml(tdiv(class="pref-group pref-input")):
label(`for`=pref): text label
select(name=pref):
for opt in options:
if opt == state:
option(value=opt, selected=""): text opt
else:
option(value=opt): text opt
option(value=opt, selected=(opt == state)):
text opt
proc genDate*(pref, state: string): VNode =
buildHtml(span(class="date-input")):
@@ -82,15 +84,12 @@ proc genDate*(pref, state: string): VNode =
proc genImg*(url: string; class=""): VNode =
buildHtml():
img(src=getPicUrl(url), class=class, alt="")
img(src=getPicUrl(url), class=class, alt="", loading="lazy", decoding="async")
proc getTabClass*(query: Query; tab: QueryKind): string =
result = "tab-item"
if query.kind == tab:
result &= " active"
if query.kind == tab: "tab-item active"
else: "tab-item"
proc getAvatarClass*(prefs: Prefs): string =
if prefs.squareAvatars:
"avatar"
else:
"avatar round"
if prefs.squareAvatars: "avatar"
else: "avatar round"


@@ -47,7 +47,7 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
# let thumb = &"{urlPrefix}{getPicUrl(get(tweet.gif).thumb)}"
# let url = &"{urlPrefix}{getPicUrl(get(tweet.gif).url)}"
<video poster="${thumb}" autoplay muted loop style="max-width:250px;">
<source src="${url}" type="video/mp4"</source></video>
<source src="${url}" type="video/mp4"></video>
#elif tweet.card.isSome:
# let card = tweet.card.get()
# if card.image.len > 0:
@@ -117,7 +117,7 @@ ${renderRssTweets(profile.tweets.content, cfg)}
<atom:link href="${link}" rel="self" type="application/rss+xml" />
<title>${xmltree.escape(list.name)} / @${list.username}</title>
<link>${link}</link>
<description>${getDescription(list.name & " by @" & list.username, cfg)}</description>
<description>${getDescription(&"{list.name} by @{list.username}", cfg)}</description>
<language>en-us</language>
<ttl>40</ttl>
${renderRssTweets(tweets, cfg)}
@@ -135,7 +135,7 @@ ${renderRssTweets(tweets, cfg)}
<atom:link href="${link}" rel="self" type="application/rss+xml" />
<title>Search results for "${escName}"</title>
<link>${link}</link>
<description>${getDescription("Search \"" & escName & "\"", cfg)}</description>
<description>${getDescription(&"Search \"{escName}\"", cfg)}</description>
<language>en-us</language>
<ttl>40</ttl>
${renderRssTweets(tweets, cfg)}


@@ -63,12 +63,10 @@ proc renderSearchPanel*(query: Query): VNode =
hiddenField("f", "tweets")
genInput("q", "", query.text, "Enter search...", class="pref-inline")
button(`type`="submit"): icon "search"
if isPanelOpen(query):
input(id="search-panel-toggle", `type`="checkbox", checked="")
else:
input(id="search-panel-toggle", `type`="checkbox")
label(`for`="search-panel-toggle"):
icon "down"
input(id="search-panel-toggle", `type`="checkbox", checked=isPanelOpen(query))
label(`for`="search-panel-toggle"): icon "down"
tdiv(class="search-panel"):
for f in @["filter", "exclude"]:
span(class="search-title"): text capitalize(f)
@@ -88,7 +86,7 @@ proc renderSearchPanel*(query: Query): VNode =
genDate("until", query.until)
tdiv:
span(class="search-title"): text "Near"
genInput("near", "", query.near, placeholder="Location...")
genInput("near", "", query.near, "Location...", autofocus=false)
proc renderTweetSearch*(results: Result[Tweet]; prefs: Prefs; path: string;
pinned=none(Tweet)): VNode =


@@ -7,16 +7,12 @@ import renderutils
import ".."/[types, utils, formatters]
import general
proc getSmallPic(url: string): string =
result = url
if "?" notin url and not url.endsWith("placeholder.png"):
result &= ":small"
result = getPicUrl(result)
const doctype = "<!DOCTYPE html>\n"
proc renderMiniAvatar(user: User; prefs: Prefs): VNode =
let url = getPicUrl(user.getUserPic("_mini"))
buildHtml():
img(class=(prefs.getAvatarClass & " mini"), src=url)
img(class=(prefs.getAvatarClass & " mini"), src=url, loading="lazy")
proc renderHeader(tweet: Tweet; retweet: string; prefs: Prefs): VNode =
buildHtml(tdiv):
@@ -57,19 +53,21 @@ proc renderAlbum(tweet: Tweet): VNode =
tdiv(class="attachment image"):
let
named = "name=" in photo
orig = if named: photo else: photo & "?name=orig"
small = if named: photo else: photo & "?name=small"
a(href=getPicUrl(orig), class="still-image", target="_blank"):
small = if named: photo else: photo & smallWebp
a(href=getOrigPicUrl(photo), class="still-image", target="_blank"):
genImg(small)
proc isPlaybackEnabled(prefs: Prefs; video: Video): bool =
case video.playbackType
proc isPlaybackEnabled(prefs: Prefs; playbackType: VideoType): bool =
case playbackType
of mp4: prefs.mp4Playback
of m3u8, vmap: prefs.hlsPlayback
proc renderVideoDisabled(video: Video; path: string): VNode =
proc hasMp4Url(video: Video): bool =
video.variants.anyIt(it.contentType == mp4)
proc renderVideoDisabled(playbackType: VideoType; path: string): VNode =
buildHtml(tdiv(class="video-overlay")):
case video.playbackType
case playbackType
of mp4:
p: text "mp4 playback disabled in preferences"
of m3u8, vmap:
@@ -84,36 +82,38 @@ proc renderVideoUnavailable(video: Video): VNode =
p: text "This media is unavailable"
proc renderVideo*(video: Video; prefs: Prefs; path: string): VNode =
let container =
if video.description.len > 0 or video.title.len > 0: " card-container"
else: ""
let
container = if video.description.len == 0 and video.title.len == 0: ""
else: " card-container"
playbackType = if prefs.proxyVideos and video.hasMp4Url: mp4
else: video.playbackType
buildHtml(tdiv(class="attachments card")):
tdiv(class="gallery-video" & container):
tdiv(class="attachment video-container"):
let thumb = getSmallPic(video.thumb)
if not video.available:
img(src=thumb)
renderVideoUnavailable(video)
elif not prefs.isPlaybackEnabled(video):
img(src=thumb)
renderVideoDisabled(video, path)
else:
let vid = video.variants.filterIt(it.contentType == video.playbackType)
let source = getVidUrl(vid[0].url)
case video.playbackType
let canPlay = prefs.isPlaybackEnabled(playbackType)
if video.available and canPlay:
let
vidUrl = video.getVidVariant(playbackType).url
source = if prefs.proxyVideos: getVidUrl(vidUrl)
else: vidUrl
case playbackType
of mp4:
if prefs.muteVideos:
video(poster=thumb, controls="", muted=""):
source(src=source, `type`="video/mp4")
else:
video(poster=thumb, controls=""):
source(src=source, `type`="video/mp4")
video(src=source, poster=thumb, controls="", muted=prefs.muteVideos, preload="metadata")
of m3u8, vmap:
video(poster=thumb, data-url=source, data-autoload="false")
video(poster=thumb, data-url=source, data-autoload="false", muted=prefs.muteVideos)
verbatim "<div class=\"video-overlay\" onclick=\"playVideo(this)\">"
tdiv(class="overlay-circle"): span(class="overlay-triangle")
verbatim "</div>"
else:
img(src=thumb, loading="lazy", decoding="async")
if not canPlay:
renderVideoDisabled(playbackType, path)
else:
renderVideoUnavailable(video)
if container.len > 0:
tdiv(class="card-content"):
h2(class="card-title"): text video.title
@@ -124,14 +124,9 @@ proc renderGif(gif: Gif; prefs: Prefs): VNode =
buildHtml(tdiv(class="attachments media-gif")):
tdiv(class="gallery-gif", style={maxHeight: "unset"}):
tdiv(class="attachment"):
let thumb = getSmallPic(gif.thumb)
let url = getPicUrl(gif.url)
if prefs.autoplayGifs:
video(class="gif", poster=thumb, controls="", autoplay="", muted="", loop=""):
source(src=url, `type`="video/mp4")
else:
video(class="gif", poster=thumb, controls="", muted="", loop=""):
source(src=url, `type`="video/mp4")
video(class="gif", poster=getSmallPic(gif.thumb), autoplay=prefs.autoplayGifs,
controls="", muted="", loop=""):
source(src=getPicUrl(gif.url), `type`="video/mp4")
proc renderPoll(poll: Poll): VNode =
buildHtml(tdiv(class="poll")):
@@ -146,12 +141,12 @@ proc renderPoll(poll: Poll): VNode =
span(class="poll-choice-value"): text percStr
span(class="poll-choice-option"): text poll.options[i]
span(class="poll-info"):
text insertSep($poll.votes, ',') & " votes • " & poll.status
text &"{insertSep($poll.votes, ',')} votes • {poll.status}"
proc renderCardImage(card: Card): VNode =
buildHtml(tdiv(class="card-image-container")):
tdiv(class="card-image"):
img(src=getPicUrl(card.image), alt="")
img(src=getPicUrl(card.image), alt="", loading="lazy")
if card.kind == player:
tdiv(class="card-overlay"):
tdiv(class="overlay-circle"):
@@ -325,7 +320,7 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
if tweet.attribution.isSome:
renderAttribution(tweet.attribution.get(), prefs)
if tweet.card.isSome:
if tweet.card.isSome and tweet.card.get().kind != hidden:
renderCard(tweet.card.get(), prefs, path)
if tweet.photos.len > 0:
@@ -344,7 +339,7 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
renderQuote(tweet.quote.get(), prefs, path)
if mainTweet:
p(class="tweet-published"): text getTime(tweet)
p(class="tweet-published"): text &"{getTime(tweet)}"
if tweet.mediaTags.len > 0:
renderMediaTags(tweet.mediaTags)
@@ -356,7 +351,12 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
a(class="show-thread", href=("/i/status/" & $tweet.threadId)):
text "Show this thread"
proc renderTweetEmbed*(tweet: Tweet; path: string; prefs: Prefs; cfg: Config; req: Request): VNode =
buildHtml(tdiv(class="tweet-embed")):
proc renderTweetEmbed*(tweet: Tweet; path: string; prefs: Prefs; cfg: Config; req: Request): string =
let node = buildHtml(html(lang="en")):
renderHead(prefs, cfg, req)
renderTweet(tweet, prefs, path, mainTweet=true)
body:
tdiv(class="tweet-embed"):
renderTweet(tweet, prefs, path, mainTweet=true)
result = doctype & $node

tests/requirements.txt (new file)

@@ -0,0 +1 @@
seleniumbase


@@ -3,11 +3,6 @@ from parameterized import parameterized
card = [
['Thom_Wolf/status/1122466524860702729',
'pytorch/fairseq',
'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - GitHub - pytorch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.',
'github.com', True],
['nim_lang/status/1136652293510717440',
'Version 0.20.0 released',
'We are very proud to announce Nim version 0.20. This is a massive release, both literally and figuratively. It contains more than 1,000 commits and it marks our release candidate for version 1.0!',
@@ -25,10 +20,10 @@ card = [
]
no_thumb = [
['Bountysource/status/1141879700639215617',
'Post a bounty on kivy/plyer!',
'Automation and Screen Reader Support',
'bountysource.com'],
['Thom_Wolf/status/1122466524860702729',
'facebookresearch/fairseq',
'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - GitHub - facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.',
'github.com'],
['brent_p/status/1088857328680488961',
'Hts Nim Sugar',
@@ -40,14 +35,9 @@ no_thumb = [
'A sample of a Qt app written using mostly nim. Contribute to sinkingsugar/nimqt-example development by creating an account on GitHub.',
'github.com'],
['mobile_test/status/490378953744318464',
'Nantasket Beach',
'Explore this photo titled Nantasket Beach by Ben Sandofsky (@sandofsky) on 500px',
'500px.com'],
['nim_lang/status/1082989146040340480',
'Nim in 2018: A short recap',
'Posted in r/programming by u/miran1',
'Posted by u/miran1 - 36 votes and 46 comments',
'reddit.com']
]
@@ -81,7 +71,7 @@ class CardTest(BaseTestCase):
c = Card(Conversation.main + " ")
self.assert_text(title, c.title)
self.assert_text(destination, c.destination)
self.assertIn('_img', self.get_image_url(c.image + ' img'))
self.assertIn('/pic/', self.get_image_url(c.image + ' img'))
if len(description) > 0:
self.assert_text(description, c.description)
if large:
@@ -104,7 +94,7 @@ class CardTest(BaseTestCase):
c = Card(Conversation.main + " ")
self.assert_text(title, c.title)
self.assert_text(destination, c.destination)
self.assertIn('_img', self.get_image_url(c.image + ' img'))
self.assertIn('/pic/', self.get_image_url(c.image + ' img'))
self.assert_element_visible('.card-overlay')
if len(description) > 0:
self.assert_text(description, c.description)


@@ -17,11 +17,6 @@ protected = [
invalid = [['thisprofiledoesntexist'], ['%']]
banner_color = [
['nim_lang', '22, 25, 32'],
['rustlang', '35, 31, 32']
]
banner_image = [
['mobile_test', 'profile_banners%2F82135242%2F1384108037%2F1500x500']
]
@@ -74,12 +69,6 @@ class ProfileTest(BaseTestCase):
self.open_nitter('user')
self.assert_text('User "user" has been suspended')
@parameterized.expand(banner_color)
def test_banner_color(self, username, color):
self.open_nitter(username)
banner = self.find_element(Profile.banner + ' a')
self.assertIn(color, banner.value_of_css_property('background-color'))
@parameterized.expand(banner_image)
def test_banner_image(self, username, url):
self.open_nitter(username)


@@ -3,14 +3,18 @@ from parameterized import parameterized
text = [
['elonmusk/status/1138136540096319488',
'Tesla Owners Online', '@Model3Owners',
'TREV PAGE', '@Model3Owners',
"""As of March 58.4% of new car sales in Norway are electric.
What are we doing wrong? reuters.com/article/us-norwa…"""],
['nim_lang/status/924694255364341760',
'Hacker News', '@newsycombinator',
'Why Rust fails hard at scientific computing andre-ratsimbazafy.com/why-r…']
['nim_lang/status/1491461266849808397#m',
'Nim language', '@nim_lang',
"""What's better than Nim 1.6.0?
Nim 1.6.2 :)
nim-lang.org/blog/2021/12/17…"""]
]
image = [


@@ -16,7 +16,7 @@ timeline = [
]
status = [
[20, 'jack⚡️', 'jack', '21 Mar 2006', 'just setting up my twttr'],
[20, 'jack', 'jack', '21 Mar 2006', 'just setting up my twttr'],
[134849778302464000, 'The Twoffice', 'TheTwoffice', '11 Nov 2011', 'test'],
[105685475985080322, 'The Twoffice', 'TheTwoffice', '22 Aug 2011', 'regular tweet'],
[572593440719912960, 'Test account', 'mobile_test', '3 Mar 2015', 'testing test']
@@ -71,7 +71,7 @@ emoji = [
retweet = [
[7, 'mobile_test_2', 'mobile test 2', 'Test account', '@mobile_test', '1234'],
[3, 'mobile_test_8', 'mobile test 8', 'jack⚡️', '@jack', 'twttr']
[3, 'mobile_test_8', 'mobile test 8', 'jack', '@jack', 'twttr']
]
reply = [