Profile overview for TheSubversives.
Submission statistics

This user has mostly submitted to the following subverses (showing top 5):

3 submissions to GreatAwakening

This user has so far shared 2 links, started 1 discussion, and submitted 383 comments.

Voting habits

Submissions: This user has upvoted 5127 and downvoted 82 submissions.

Comments: This user has upvoted 1458 and downvoted 555 comments.

Comment ratings

3 highest rated comments:

Something you should explore further. Impressive, most impressive. Q (2998) submitted by 721-AQZ to GreatAwakening

TheSubversives 20 points (+20|-0)

Fuck me, I researched this exact shit when Vault 7 was released... in the middle of it I had my entire HDD remotely wiped and never touched it again. Even Q's relatively recent ‘keystone’ drops never primed my brain to go back, look again, and make the correct connection.

I’m so glad we have so many eyes on this and intelligent patriots among us, because I bet there were many others who knew about this but never retroactively made the connection, or slowly forgot over time. This is a great reminder to always revisit what you think you know; you will always make new connections. I wanna buy this anon a beer, great find.

God bless anon, for we are much stronger together! WWG1WGA

Why isn't there more outrage regarding the treatment of Julian Assange? submitted by realneil to GreatAwakening

TheSubversives 17 points (+17|-0)

Snowden was a CIA operative, and the ‘show’ was designed to weaken the NSA’s legal spying apparatus and strengthen the CIA’s illegal spying operations. (CIA funding GOOG ++ AMZN, DARPA creating Lifelog -> Facebook.)

This has become the way the CIA skirts the issue of not being able to legally spy on US citizens. There is ‘background’ data being collected, like GPS data from Twitter, active mic and camera + GPS from the FB app + F8 algo collections at the ‘physical’ level as well, on all comms. It’s then routed to black project data centers (like under Denver Intl). There are 7, named after the seven dwarves of Snow White.

Vale Srayzie, Vale Shizy. Drums.... Drums in the Deep... They are coming submitted by MolochHunter to GreatAwakening

TheSubversives 13 points (+16|-3)

Completely agree.

Thank all of you for your service to this movement. Without people of integrity, we'd be no better than the fuckfest that is the chans. Oceans of good information cloaked from good people because of the non-navigable terrain of porn and other crazy shit.

Just out of curiosity, how are mods selected?

3 lowest rated comments:

SUPER HIGH RESOLUTION VERSION OF ENCYCLOPEDIA BRITANNICA 1771 EDITION submitted by srayzie to GreatAwakening

TheSubversives 0 points (+0|-0)

Thank you for this! Is there a subvoat/other community where I could find more resources like this?

This is immensely fascinating and I'm slowly building a large archive of things like this, so it'd be nice to find more!

SUPER HIGH RESOLUTION VERSION OF ENCYCLOPEDIA BRITANNICA 1771 EDITION submitted by srayzie to GreatAwakening

TheSubversives 0 points (+0|-0)

Thank you so much! I'll use wget to create offline archives of both of these sites! Much appreciated!

SUPER HIGH RESOLUTION VERSION OF ENCYCLOPEDIA BRITANNICA 1771 EDITION submitted by srayzie to GreatAwakening

TheSubversives 0 points (+0|-0)

It's awesome! I use: wget -mkEp https://qmap.pub

qmap.pub is just an example; you can use any website. Depending on the total size of the website, it might take a while to download. Most sites will be a couple hundred MB. The largest I've downloaded was David Wilcock's divinecosmos.com, and that was 16 GB.
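If you're worried about a site turning out to be that big, wget has a --quota option that caps the total download during a recursive grab; this is just a sketch, and the 500m cap and qmap.pub target are only placeholders:

# stop mirroring once roughly 500 MB have been downloaded
wget -mkEp --quota=500m https://qmap.pub

If I remember right, wget finishes the file it's currently working on once the quota is hit and then stops, so the mirror will be incomplete but at least bounded in size.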

These arguments create an offline mirror: wget recursively follows all links, downloads them, appends the .html extension to the right files, and rewrites the links so they point to your local copy instead of redirecting you to the online version when clicked. Wget works great for sites that are not heavily reliant on JavaScript features! You can look up what each argument does in the wget documentation if you're curious!
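For anyone who doesn't want to dig through the man page right away, here's the same command spelled out with long-form flags; qmap.pub is still just the example target:

# --mirror            recursive download with timestamping (same as -m)
# --convert-links     rewrite links to point at your local copies (same as -k)
# --adjust-extension  append .html to files that need it (same as -E)
# --page-requisites   also grab the images, CSS, and other assets each page needs (same as -p)
wget --mirror --convert-links --adjust-extension --page-requisites https://qmap.pub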

Just make sure to include or omit "www" depending on whether the website uses it. It took me a while to figure out why the mirror wasn't working properly a few times because of that. For example, voat.co does not use it, and neither does qmap.pub.
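One way around the www guessing game, assuming I'm remembering these flags right, is to let wget span both hostnames explicitly; example.com here is just a placeholder:

# --span-hosts lets the recursion cross to other hosts,
# --domains then restricts it to just the two spellings of the same site
wget -mkEp --span-hosts --domains=example.com,www.example.com https://example.com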