I've been sharing conspiracies on reddit longer than this sub has been around. I have a story to tell.
This story is mostly crafted from my own experiences, my conversations with some of the people involved, and the rest is my own guesswork as I try to fill in the gaps...so bear with me! That's why I wanted to share with this community, which I've watched grow over the years. I remember posting about the death of Barry Jennings (who witnessed explosions in the WTC on 9/11) the day after it happened. This was before /conspiracy (or right around when it was formed), and I remember thinking "we really need a sub for conspiracies on reddit!" And here we are, 12 years later and over 1.3 million subscribers...incredible! So... My story starts with a young man. We'll call him Andrew. Andrew grew up in the 90's in a coastal US town and quickly blossomed into a tech whiz at a young age. He began building his own computers, and after a brief stint using Windows, he decided that Bill Gates was everything wrong with technology (and the world), and he made it his mission to make sure folks like Gates were NOT the future of computers. He really believed that the use of technology was a fundamental human right, and that charging people for "proprietary" OS's that hid their source code was a violation of these rights. He saw a possible Deus Ex-like future, with a technocracy literally around the corner if we didn't act now. Andrew soon joined the Free Software Foundation and began rubbing elbows with the likes of Richard Stallman. He began exclusively using GNU/Linux and was the type to correct you if you called it just "Linux". He also began visiting tech-savvy forums like slashdot and started networking in earnest. By 2006 (his senior year of high school) Andrew was completely over his "education" and decided to just drop out. Shockingly, a college accepted him regardless. A small East Coast school had been actively courting Andrew, and when they learned he had failed to get his HS diploma, they admitted him anyway!
Now sometime during this period Andrew went to Iceland and stayed in Reykjavik for several months. This trip may have happened during the summer, fall, or early winter of 2006. The reason for his trip had something to do with his efforts in the FSF or similar group. The possible significance of this trip will become clear as we go on. What is clear is that Andrew started college in the fall of 2006, and that the circumstances were unusual. Andrew soon met several like-minded individuals and began building a social and technological network at his school. Two individuals in particular would become key players in his life (one more prominently in this story, but the other was significant as well), and eventually the 3 would live together in town for several years. But for now let's stick with Andrew. Andrew had an idea to build a social network for his college. Except, it wasn't just a network, it was a wiki for information about the school...and beyond. Soon, it began to morph into something much bigger in Andrew's mind. He saw his project as being one of many data "hubs" for leaks of important documents and otherwise sensitive information. So yeah, he saw the opportunity for a wiki for leaks (see where this is going yet...?). As his ambitions grew, his behavior started to become increasingly erratic. He was caught with drugs and arrested. Strangely, the charges were pretty much dropped and he was given a slap on the wrist. Eventually he decided to leave the school, but still lived in town and had access to the servers on campus. By 2010 Andrew was still living in the small town with his two "hacker" buddies, who were still enrolled at the school. This house was in some ways legendary. It appears that many "interesting" people spent time at or visited the residence. Indeed, some of the early movers and shakers of /conspiracy itself passed through. 
There was usually a full N2O (nitrous oxide) tank for anyone who was into that kinda thing, and they were stocked with every hallucinogen and research chemical known to man. It was also likely under surveillance by multiple intelligence agencies (NSA/Mossad/etc). Over time, Andrew's mental state slowly started to deteriorate, which wasn't helped by his abuse of drugs. Still, Andrew decided to move his base of operations to Europe, spending time in Belgium, the Czech Republic and elsewhere. One of his housemates was soon to join him on his adventures in Europe and elsewhere abroad. We'll call him "Aaron." Aaron had a very similar story and upbringing to Andrew's. Aaron was also from a coastal US town and was born into privilege. He was also, supposedly, born into a family with some serious connections to intelligence agencies, including an uncle with ties to the NSA, and both parents connected to military brass. By 2015, Andrew and Aaron were living together in the Czech Republic. During this time they were working directly and/or indirectly for the NSA (via Cisco and other companies). You see, the "college" they met at was actually a front for the recruitment of kids into the IC. Apparently, many "schools" in the US function that way. Go figure. Their intelligence and valuable skill set (hacking etc) made them valuable assets. They were also possibly involved with the distribution of certain "research chemicals" (of the 2C* variety) to dignitaries and their entourages (in one example, they provided 2CB to a group with David Cameron). In addition, Andrew was allegedly involved with, or stumbled upon, an NSA-linked surveillance project directed at the entire country of Malaysia, while Aaron was involved with Cisco. Aaron himself had gotten into hot water for releasing damaging information about the NSA, and even claimed to be an NSA whistleblower, and was also possibly the individual who leaked the 2014 (or 2015) Bilderberg meeting list. And then things went bad.
Andrew quit the Malaysia project and Aaron left Cisco. It seems Andrew and Aaron were "set up" during a fiery false flag event in the Czech Republic in 2015. It may have happened at an embassy, but it's unclear which. There is no information on the web about anything like this (afaik). Aaron was immediately targeted and spent several years on the run. Allegedly, he was added to the list of victims in the so-called "Great Game". The Great Game is the term used for an international assassination program where intelligence agencies share a list of targets to be neutralized. The German BND and Mossad are heavily involved, as are other networks. Individuals targeted by the Great Game may be offed by actual assassins, or by NPC-like humans whose minds are influenced by mind control tech (a la The Matrix; say, influencing someone to unwittingly ram your car). As Aaron went on the lam, Andrew soon returned to the US, shell-shocked by his experience. Both Andrew and Aaron continue to suffer from some sort of PTSD from this series of events, rendering Andrew largely incapacitated and Aaron scattered and discombobulated.
The Meat of the Matter
OK...where does that leave us? Why am I sharing all of this? I think there's much more to this story. So let's start speculating! Everything I'm about to say is stuff that was told to me personally. I can't vouch for any of this information, though obviously I thought it was compelling enough to share. Here's the gist: The so-called whistleblowers you see in the media are almost all fake. This includes: Edward Snowden, Julian Assange, Thomas Drake and William Binney (hey look, his AMA is pinned on this sub right now...no comment!). These individuals, and others, are controlled opposition. The real whistleblowers are severely punished. For example, Bradley Manning was punished with chemical castration in jail. His "transformation" was chemically induced torture. Andrew was not alone in his passion.
There were lots of other young visionaries like him who dreamed of a freer and more transparent world. In this story, Julian Assange was an intelligence asset...a psyop meant to steal the thunder from real activists like Andrew. In this story, a small college-based "wiki" for government leaks was used as the model for an intelligence operation known as "wikileaks". In this story, Andrew traveled to Iceland at some point in 2006. When was Wikileaks founded? Wikileaks was founded by Julian Assange in December 2006, in Iceland. Aaron discovered (legally, like Manning, who had clearance to access all the data he leaked) damning information about surveillance by the NSA, specifically against recruits entering the US army and elsewhere. In this story, the "Andrew" identity was co-opted and turned into "Julian Assange", and "Aaron" became "Edward Snowden". Granted, there were probably other people that these whistleblower imposters were modeled after, but Andrew and Aaron seem like very strong contenders for some of this inspiration. Now, much of the following may be gobbledygook (lol I spelled that right first try!) for all I know, but since I'm having a really hard time making sense of it all, I'll just include everything I can and let you guys run with it. Here are some phrases, ideas, terms and people of note that may be involved with this story...MODS: None of this is doxing! All of the links to people are Wikipedia pages or published interviews/articles. So yeah. Not dox!
Rootkits: These are the currency and weapons of the intelligence agencies, allowing access to any computer or OS on the planet.
"Schizo-affective disorder" doesn't exist; it's a cover for EM warfare. For those they can't attack chemically (like Manning), they punish via EM. This technology has been used for decades. Both Andrew and Aaron were likely subjected to this punishment after they stopped playing ball. It likely continues for them both as well.
IN CONCLUSION I don't know how these terms, theories and individuals fit into this story, only that they may be somehow related. Hopefully there are enough bread crumbs in here to keep some of you busy! Any help/insight would be appreciated. I confess I'm not so tech-minded, so I can't offer any more explanation about some of the more techy terms. Anyway, thanks for reading, and thanks for continuing to stimulate discussion after all these years! It's really nice to see this place continuing to thrive after all this time!
Around 1750, a band of highway robbers was terrorizing Brittany. It was led by a woman: Marion du Faouët. What drove the young gang leader was a radical sense of justice.
Existentialism is a philosophical movement and a way of life: the human being is condemned to radical freedom, must design his own essence, and to that end must get involved in politics. (BR 2010)
Corporal punishment had been part of the "peinliche Strafen" (punishments of the body) since the Middle Ages. In the course of the reforms of 1848, Prussia became the first state to replace flogging with imprisonment.
When the banker Paul Abraham was first driven to ruin by risky investments and a pronounced gambling addiction, he was still writing "serious music". With a violin concerto, a serenade and a string quartet, Paul Abraham was on his way to becoming a "classical" composer. Yet only five years later he was among the most sought-after popular-music composers in Europe. Author: Niklas Rudolph
The date was June 10, 2018. The sun was shining, the grass was growing, and the birds were singing. At least, that's what I assumed. Being a video game and tech obsessed teenager, I was indoors, my eyes glued to my computer monitor like a starving lion spying on a plump gazelle. I was watching the E3 (Electronic Entertainment Expo) 2018 broadcast on twitch.tv, a popular streaming website. Video game developers use E3 as an annual opportunity to showcase any upcoming video game projects to the public. So far, the turnout had been disappointing. Multiple game developers had failed to unveil anything of actual substance for an entire two hours. A graphical update here, a bug fix there. Issues that should have been fixed at every game's initial launch, not a few months after release. Feeling hopeless, I averted my eyes from my computer monitor to check Reddit (a social media app/website) to see if there were any forum posts that I had yet to read. But then, I heard it. The sound of music composer Mick Gordon's take on the original "DooM" theme, the awesome combination of metal and electronic music. I looked up at my screen and gasped. Bethesda Softworks and id Software had just announced "DOOM: Eternal", the fifth addition in the "DooM" video game series. "DOOM: Eternal" creative director Hugo Martin promised that the game would feel more powerful than its 2016 predecessor, there would be twice as many enemy types, and the Doom community would finally get to see "hell on earth". (Martin) As a fan of "DOOM (2016)", I was ecstatic. The original "DooM" popularized the "First Person Shooter (FPS)" genre, and I couldn't wait to experience the most recent entry in the series. "DOOM (1993)" was a graphical landmark when it was originally released, yet nowadays it looks extremely dated, especially compared to "DOOM: Eternal". What advancements in computer technology drove this graphical change?
Computers became faster, digital storage increased, and computer peripherals were able to display higher resolutions and refresh rates. ["DooM" 1993 graphics example: screenshot (Doom | Doom Wiki). "DOOM: Eternal" graphics example: screenshot (Bailey).] In their video "Evolution Of DOOM", the video game YouTube channel "gameranx" says that on December 10, 1993, a file titled "DOOM1_0.zip" was uploaded to the File Transfer Protocol (FTP) server of the University of Wisconsin. This file, two megabytes in size, contained the video game "DooM", created by the game development group "id Software". (Evolution of DOOM) While not the first game in the "First Person Shooter" (FPS) genre, "DooM" popularized the genre, to the point of any other FPS game being referred to as a "Doom Clone" until the late 1990s. (Doom clones | Doom Wiki) The graphics of the original "DooM" are definitely a major downgrade compared to today's graphical standards, but keep in mind that the minimum system requirements of "DooM", according to the article "Doom System Requirements" on gamesystemrequirements.com, were eight megabytes of RAM, an Intel Pentium or AMD (Advanced Micro Devices) 486 processor cycling at sixty-six megahertz or more, and an operating system that was Windows 95 or above. (Doom System Requirements) In case you don't speak the language of technology (although I hope you learn a thing or two by the end of this essay), that speed and storage capacity are laughable compared to the specifications of today. By 1993, the microprocessor, or CPU (Central Processing Unit), had been around for twenty-two years, having been introduced in 1971 by "Intel", the CPU manufacturer founded by Robert Noyce and Gordon Moore. Gordon Moore also formulated "Moore's law", which states "The number of transistors incorporated in a chip will approximately double every 24 months".
(Moore) Sadly, according to writer and computer builder Steve Blank in his article "The End of More - The Death of Moore's Law", this law would end around 2005, thanks to the basic laws of physics. (Blank) 1993 also marked an important landmark for Intel, who had just released the first "Pentium" processor, which was capable of a base clock of 60 MHz (megahertz). The term "base clock" refers to the default speed of a CPU; this speed can be adjusted by the user, and "MHz" refers to one million cycles per second. A cycle is one tick of the CPU's internal clock, during which the processor carries out a basic step of computation. The more cycles per second the CPU runs at, the more problems get solved. Intel would continue upgrading their "Pentium" lineup until January 4, 2000, when they released the "Celeron" processor, with a base clock of 533 MHz. Soon after, on June 19, 2000, rival CPU company AMD would release their "Duron" processor, which had a base clock of 600 MHz and a maximum clock of 1.8 GHz (gigahertz). One GHz is equal to 1,000 MHz. Intel and AMD had established themselves as the two major CPU companies in the 1970s in Silicon Valley. Both companies have been bitter rivals since then, trading figurative blows in the form of competitive releases, discounts, and one-upmanship to this day. Moving on to April 21, 2005, when AMD released the first dual-core CPU, the "Athlon 64 X2 3800+". The notable feature of this CPU, besides a 2.0 GHz base clock, was that it was the first CPU to have two cores. A CPU core is an individual processor within the CPU. The more cores a CPU has, the more tasks it can perform per cycle, thus maximizing its efficiency. Intel wouldn't respond until January 9, 2006, when they released their dual-core processor, the "Core 2 Duo Processor E6320", with a base clock of 1.86 GHz.
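To make the clock-speed numbers above concrete, here's a quick sketch in Python. The chip names and clocks are the ones quoted above; the Moore's law function simply applies the doubling rule to a made-up starting count, it isn't real transistor data:

```python
# Clock-speed arithmetic: 1 MHz = 1,000,000 cycles per second,
# and 1 GHz = 1,000 MHz.
MHZ = 1_000_000          # cycles per second
GHZ = 1_000 * MHZ

# Base clocks quoted in the text.
pentium_1993 = 60 * MHZ       # first Pentium (1993)
duron_2000_max = 1.8 * GHZ    # AMD Duron maximum clock (2000)

print(f"Pentium: {pentium_1993:,.0f} cycles/second")
print(f"Duron max: {duron_2000_max / MHZ:,.0f} MHz")   # 1.8 GHz = 1,800 MHz

# Moore's law: transistor counts roughly double every 24 months.
def transistors(initial: int, years: float) -> float:
    """Projected transistor count after `years`, doubling every 2 years."""
    return initial * 2 ** (years / 2)

# Over one decade that's 2**5 = 32x growth.
print(transistors(1_000_000, 10))  # 32000000.0
```

So a chip that started at one million transistors would, by Moore's projection, reach thirty-two million a decade later.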
(Computer Processor History) According to tech entrepreneur Linus Sebastian in his YouTube videos "10 Years of Gaming PCs: 2009 - 2014 (Part 1)" and "10 Years of Gaming PCs: 2015 - 2019 (Part 2)", AMD would have the upper hand over Intel until 2011, when Intel released the "Sandy Bridge" CPU microarchitecture, which was faster than and around the same price as AMD's competing products at the time. (Sebastian) The article "What is Microarchitecture?" on the website Computer Hope defines microarchitecture as "a hardware implementation of an ISA (instruction set architecture). An ISA is a structure of commands and operations used by software to communicate with hardware. A microarchitecture is the hardware circuitry that implements one particular ISA". (What is Microarchitecture?) A CPU's microarchitecture is also referred to as its generation. Intel would continue to dominate the high-end CPU market until 2019, when AMD would "dethrone" Intel with their third-generation "Ryzen" CPU lineup. The most notable of these was the "Ryzen 9 3950X", which had a total of sixteen cores, thirty-two threads, a base clock of 3.5 GHz, and a maximum clock of 4.7 GHz. (Sebastian) The term "thread" refers to splitting one core into virtual cores via a process known as "simultaneous multithreading". Simultaneous multithreading allows one core to perform two tasks at once. What CPU your computer has is extremely influential in how fast your computer can run, but for video games and other types of graphics, there is a special type of processor designed specifically for the task of "rendering" (displaying) and generating graphics. This processor is known as the graphics processing unit, or "GPU". The term "GPU" wasn't used until around 1999, when video cards started to evolve beyond the generation of two-dimensional graphics and into the generation of three-dimensional graphics.
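The core/thread relationship described above can be sketched in a few lines of Python. The assumption here (true of the Ryzen chips mentioned) is that simultaneous multithreading exposes two threads per physical core:

```python
import os

# With simultaneous multithreading (SMT), each physical core presents
# two hardware threads ("virtual cores") to the operating system.
def logical_cores(physical_cores: int, smt: bool = True) -> int:
    """Number of logical cores the OS sees for a given physical core count."""
    return physical_cores * (2 if smt else 1)

# The Ryzen 9 3950X quoted above: 16 cores -> 32 threads.
print(logical_cores(16))             # 32
print(logical_cores(16, smt=False))  # 16 if SMT were disabled

# os.cpu_count() reports logical cores, so on an SMT machine it is
# typically double the number of physical cores.
print(os.cpu_count())
```

This is why spec sheets list the 3950X as "16 cores / 32 threads": the operating system schedules work onto thirty-two logical processors even though there are only sixteen physical cores.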
According to user "Olena" in their article "A Brief History of GPU", the first GPU was the "GeForce 256", created by GPU company "Nvidia" in 1999. Nvidia promoted the GeForce 256 as "A single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second". (Olena) Unlike the evolution of CPUs, the history of GPUs is more one-sided, with AMD playing a game of "catchup" ever since Nvidia overtook AMD in the high-end GPU market in 2013. (Sebastian) Fun fact: GPUs aren't used only for gaming! In 2010, Nvidia collaborated with Audi to power the dashboards and improve the entertainment and navigation systems in Audi's cars! (Olena) Much to my (and many other tech enthusiasts') dismay, GPUs would increase dramatically in price thanks to the "bitcoin mania" around 2017. This was, according to senior editor Tom Warren in his article "Bitcoin Mania is Hurting PC Gamers By Pushing Up GPU Prices" on theverge.com, around an 80% increase in price for the same GPU due to stock shortages. (Warren) Just for context, Nvidia's "flagship" GPU in 2017 was the 1080 Ti, the finest card of the "Pascal" microarchitecture. Fun fact: I have this card. The 1080 Ti launched for $699, with the specifications of a base clock of 1,481 MHz, a maximum clock of 1,582 MHz, and 11 gigabytes of GDDR5X VRAM (memory that is exclusive to the GPU), according to the box it came in. Compare this to Nvidia's most recent flagship GPU, the 2080 Ti of Nvidia's follow-up "Turing" microarchitecture, another card I have. This GPU launched in 2018 for $1,199. The 2080 Ti's specifications, according to the box it came in, included a base clock of 1,350 MHz, a maximum clock of 1,545 MHz, and 11 gigabytes of GDDR6 VRAM.
A major reason why "DooM" was so popular and genius was how id Software developer John Carmack managed to "fake" three-dimensional graphics without taking up too much processing power, hard drive space, or "RAM" (Random Access Memory), a specific type of digital storage. According to the article "RAM (Random Access Memory) Definition" on the website TechTerms, RAM is known as "volatile" memory because, unlike normal storage (which at the time took the form of hard-drive space), it only holds data while the computer is turned on; it is also much faster than normal storage. A commonly used analogy is that RAM is the computer's short-term memory, storing temporary files to be used by programs, while hard-drive storage is the computer's long-term memory. (RAM (Random Access Memory) Definition) As I stated earlier, in 1993, "DooM" required 8 megabytes of RAM to run. For some context, as of 2020, "DOOM: Eternal" requires a minimum of 8 gigabytes of DDR4 (more on this later) RAM to run, with most gaming machines possessing 16 gigabytes of DDR4 RAM. According to tech journalist Scott Thornton in his article "What is DDR (Double Data Rate) Memory and SDRAM Memory", in 1993, the popular format of RAM was "SDRAM", which stands for "Synchronous Dynamic Random Access Memory". SDRAM differs from its predecessor, "DRAM" (Dynamic Random Access Memory), by being synchronized with the clock speed of the CPU. DRAM was asynchronous (not synchronized by any external influence), which "posted a problem in organizing data as it comes in so it can be queued for the process it's associated with". SDRAM was able to transfer data one time per clock cycle, and its replacement in the early 2000s, "DDR SDRAM" (Double Data Rate Synchronous Dynamic Random Access Memory), was able to transfer data two times per clock cycle. This evolution of RAM continues to this day. In 2003, DDR2 SDRAM was released, able to transfer four pieces of data per clock cycle.
In 2007, DDR3 SDRAM was able to transfer eight pieces of data per clock cycle. In 2014, DDR4 SDRAM was still able to transfer eight pieces of data per cycle, but the clock speed had increased by 600 MHz, and the overall power consumption had been reduced from 3.3 volts for the original SDRAM to 1.2 volts for DDR4. (Thornton) The digital size of each "RAM stick" (a physical stick of RAM that you insert into your computer) had also increased, from around two megabytes per stick to up to 128 gigabytes per stick in 2020 (although this particular option will cost you around $1,000 per stick depending on the manufacturer), with the average stick size being 8 gigabytes. The average computer nowadays accepts up to four RAM sticks, although more high-end systems can take up to sixteen or even thirty-two! Rewind back to 1993, where the original "DooM" took up two megabytes of storage, not to be confused with RAM. According to tech enthusiast Rex Farrance in their article "Timeline: 50 Years of Hard Drives", the average computer at this time had around two gigabytes of storage. Storage took the form of magneto-optical discs, a combination of the earlier magnetic discs and optical discs. (Farrance) This format of storage is still in use today, although mainly for large amounts of rarely used data, while data that is commonly used by programs (including the operating system) is put on solid-state drives, or SSDs. According to tech journalist Keith Foote in their article "A Brief History of Data Storage", SSDs differ from HDDs by being much faster and smaller, storing data on a flash memory chip, not unlike a USB thumb drive. While SSD-style technology had been used as far back as the 1950s, SSDs wouldn't find their way into the average gaming machine until the early 2010s. (Foote) One way to think about an SSD is as your common knowledge: it doesn't contain every piece of information you know, just what you use on a daily basis.
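The transfers-per-cycle progression described above (SDRAM 1, DDR 2, DDR2 4, DDR3 and DDR4 8) can be turned into a tiny Python table of effective transfer rates. The 100 MHz bus clock below is purely an illustrative assumption, not a figure from the article:

```python
# Transfers per clock cycle for each RAM generation, per the text.
TRANSFERS_PER_CYCLE = {
    "SDRAM": 1,
    "DDR":   2,
    "DDR2":  4,
    "DDR3":  8,
    "DDR4":  8,   # same transfers/cycle as DDR3, but at higher clocks
}

def effective_rate(clock_mhz: float, generation: str) -> float:
    """Effective transfers per second, in millions (MT/s)."""
    return clock_mhz * TRANSFERS_PER_CYCLE[generation]

# Illustrative 100 MHz bus clock (an assumption for comparison only):
for gen in TRANSFERS_PER_CYCLE:
    print(f"{gen}: {effective_rate(100, gen):.0f} MT/s")
```

At the same hypothetical clock, DDR moves twice as much data per second as the original SDRAM, and DDR3/DDR4 move eight times as much, which is why generation matters even before clock speeds rise.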
For example, my computer has around 750 gigabytes of storage in SSDs and around two terabytes of internal HDD storage. On my SSDs, I have my operating system, my favorite programs and games, and any files that I use frequently. On my HDD, I have everything else that I don't use on a regular basis. "DOOM: Eternal" would release on March 20, 2020, four months after its original release date of November 22, 2019. And let me tell you, I was excited. The second my clock turned from 11:59 P.M. to 12:00 A.M., I repeatedly clicked my refresh button, desperately waiting to see the words "Coming March 20" transform into the ever so beautiful and elegant phrase: "Download Now". At this point in time, I had a monitor that was capable of displaying roughly two million pixels spread out over its 27-inch display panel, at a rate of 240 times a second. Speaking of monitors and displays, according to the article "The Evolution of the Monitor" on the website PCR, at the time of the original "DooM" release, the average monitor was either a CRT (cathode ray tube) monitor or the newer (and more expensive) LCD (liquid crystal display) monitor. The CRT was first unveiled in 1897 by the German physicist Karl Ferdinand Braun. CRT monitors functioned by firing cathode rays (electron beams) that generate an image on a phosphorescent screen. These monitors would have an average resolution of 800 by 600 pixels and a refresh rate of around 30 frames per second. CRT monitors would eventually be replaced by LCD monitors in the late 2000s. LCD monitors function by using two pieces of polarized glass with liquid crystal between them. A backlight shines through the first piece of polarized glass (also known as the substrate). Electrical currents then cause the liquid crystals to adjust how much light passes through to the second substrate, which creates the images that are displayed.
(The Evolution of the Monitor) The average resolution would increase to 1920x1080 pixels and the refresh rate would increase to 60 frames a second around 2010. Nowadays, there are high-end monitors capable of displaying up to 7,680 by 4,320 pixels, and also monitors capable of displaying up to 360 frames per second, assuming you have around $1,000 lying around. At long last, it had finished. My 40.02 gigabyte download of "DOOM: Eternal" had finally completed, and oh boy, I was ready to experience this. I ran over to my computer, my beautiful creation sporting 32 gigs of DDR4 RAM, an AMD Ryzen 7 "3800X" with a base clock of 3.8 GHz, an Nvidia 2080 Ti, 750 gigabytes of SSD storage and two terabytes of HDD storage. Finally, after two years of waiting, I grabbed my mouse and moved my cursor over that gorgeous button titled "Launch DOOM: Eternal". Thanks to multiple advancements in the speed of CPUs, the size of RAM and storage, and display resolution and refresh rate, "DooM" has evolved from an archaic, pixelated video game in 1993 into the beautiful, realistic and smooth video game it is today. And personally, I can't wait to see what the future has in store for us.
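The resolution figures above are easy to check with a little arithmetic; a 1920x1080 panel really does hold the "roughly two million pixels" mentioned earlier:

```python
def pixel_count(width: int, height: int) -> int:
    """Total pixels on a display panel."""
    return width * height

print(pixel_count(800, 600))     # 480,000 on the typical CRT above
print(pixel_count(1920, 1080))   # 2,073,600 -> roughly two million
print(pixel_count(7680, 4320))   # 33,177,600 on a high-end "8K" panel

# Pixels the monitor must redraw per second at a given refresh rate:
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return pixel_count(width, height) * hz

# A 1080p panel at 240 Hz, like the monitor described above:
print(f"{pixels_per_second(1920, 1080, 240):,}")  # 497,664,000
```

So a 240 Hz 1080p monitor is redrawing nearly half a billion pixels every second, which is part of why fast GPUs matter as much as fast displays.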
Newbs might not know this, but bitcoin recently came out of an intense internal drama. Between July 2015 and August 2017 bitcoin was attacked by external forces who were hoping to destroy the very properties that made bitcoin valuable in the first place. This culminated in the creation of segwit and the UASF (user activated soft fork) movement. The UASF was successful, segwit was added to bitcoin, and with that the anti-decentralization side left bitcoin altogether and created their own altcoin called bcash. Bitcoin's price was $2500; soon after segwit was activated the price doubled to $5000 and continued rising to a top of $20000 before correcting to where we are today. During this drama, I took time away from writing open source code to help educate and argue on reddit, twitter and other social media. I came up with a reading list for quickly copypasting things. It may be interesting today for newbs or anyone who wants a history lesson on what exactly happened during those two years when bitcoin's very existence as a decentralized low-trust currency was questioned. Now that the fight has essentially been won, I try not to comment on reddit that much anymore. There's nothing left to do except wait for Lightning and similar tech to mature (or better yet, help code and test it). In this thread you can learn about block sizes, latency, decentralization, segwit, ASICBOOST, the lightning network and all the other issues that were debated endlessly for over two years. So when someone tries to get you to invest in bcash, remind them of the time they supported Bitcoin Unlimited. For more threads like this see UASF.
Console gaming is hardly different from PC gaming, and much of what people say to put PC gaming above console gaming is simply wrong.
I'm not sure about you, but for the past few years, I've been hearing people go on and on about PCs' "superiority" over the console market. People cite various reasons why they believe gaming on a PC is "objectively" better than console gaming, often reasons related to power, cost, ease of use, and freedom. ...Only problem: much of what they say is wrong. There are many misconceptions being thrown about in the PC gaming vs console gaming debate that I believe need to be addressed. This isn't about "PC gamers being wrong" or "consoles being the best," absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. I mean, yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I'm not particularly partial to one over the other. Now, I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don't point out many specific Xbox examples doesn't mean that they aren't out there.
“PCs can use TVs and monitors.”
This one isn't so much a misconception as it is the implication of one, and overall just... confusing. It appears in some articles and the pcmasterrace "why choose a PC" section, where they're practically implying that consoles can't do this. I mean, yes, as long as the ports on your PC match up with your screen's inputs, you could plug a PC into either... but you could do the same with a console, again, as long as the ports match up. I'm guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, and consoles are generally restricted to HDMI... But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. I mean, even if the monitor/TV doesn't have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don't match your monitor/TV... use an adapter. I don't know what the point of this argument is, but it's made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? They’ve been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way. Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part 1: the Software. This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs. Dirt Rally, a hardcore racing sim, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that’s available for practically every console-available game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount. Part 2: the Subscription. Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side, it’s only required for the PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here’s the thing: although you have to factor this $60 annual cost into your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to not have to deal with the price, it’s not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make a full meal, you have to pay an annual fee. Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, it also lets you eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let’s look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
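Those perks translate into concrete numbers. Here’s a minimal Python sketch of the annual totals, using the post’s own figures (valuing each free game at its digital price is my own assumption, not something Sony guarantees):

```python
# Rough PS Plus value sketch using the post's own figures.
# Assumption: each free monthly game is "worth" its digital price.
ANNUAL_FEE = 60  # USD per year for PS Plus

# Free games per month: 2 PS4 + 2 PS3 + 2 Vita/cross-buy titles
free_per_month = 2 + 2 + 2
free_per_year = free_per_month * 12
print(free_per_year)  # 72 games a year across all platforms

# Break-even: a single $60 digital giveaway (like Just Cause 3 in August)
# covers the whole annual fee on its own.
leftover = ANNUAL_FEE - 60
print(leftover)  # 0 -> every free game and discount after that is net savings
```

The point of the sketch is just that one full-priced giveaway zeroes out the fee; everything after is gravy.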
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe some months you get games you don’t like; then just wait until next month. In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, and every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Or, if you prefer, just count games toward paying off PS Plus until you hit $60 in savings.) All in all, PS Plus, and Xbox Gold, which offers similar options, saves you money. On top of that, again, you don’t need these memberships to get discounts, but with them, you get more discounts. Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc discounts? A lot of research and math would be needed to see whether every console gamer saves money compared to every Steam gamer for the same games, but at the very least? The costs balance out, at worst. Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren’t short. The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total. And let’s be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console at launch. Let’s look at PlayStation again, for example: in 2002, only two years after its release, the PS2’s retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the original retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention. Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives - that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
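The console-side arithmetic above checks out in a few lines; the day-one prices, the 17-year window, and the Netflix comparison are all the post’s own figures:

```python
# Day-one console cost per generation (one console per gen, the post's
# prices), spread over the ~17-year window, vs Netflix at its original
# $8/month plan.
gen6 = 299            # Xbox or PS2
gen7 = (299, 499)     # Xbox 360, PS3
gen8 = (499, 399)     # Xbox One, PS4

cheapest = gen6 + min(gen7) + min(gen8)
priciest = gen6 + max(gen7) + max(gen8)
print(cheapest, priciest)  # 997 1297 -> roughly $1,000 - $1,300

netflix_total = 8 * 12 * 17  # $8/month, every month, for 17 years
print(netflix_total)         # 1632 -> "over $1,600"
```

So even the worst-case console path comes in under the streaming subscription it’s being compared against.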
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if the hardware you’ve been pushing by gaming lasted for even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime - you can’t get it for free now, and you couldn’t for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only gotten into PC gaming recently, they’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets themselves are $500 - $600, when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market; the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you’d only be able to play on PC, but there are also some games you’d only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so “low spec” that when developers have to take them into account, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to open the games. It’s almost as if the devs didn’t worry about console specs when making the PC version of the game, because this version of the game isn’t on console. Or maybe the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into mind, right? They could make their games sooo much more powerful if they were PC-only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games do. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they’ve spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games off to players who don’t have the strongest PCs, they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own; they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point, though - developers having to deal with low-end PCs - because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for 1060, 1070, and 1080 owners). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t hold up against the tens of millions of 8th gen consoles sold; looking at it that way, sure, the number of Nvidia 10-series owners is over 20 million, but there are over 5 times more 8th gen consoles sold than that. Basically, even though PCs run on a spectrum, saying they’re more powerful “on average” is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right?
Because of the previous point: PCs don’t have some ubiquitous quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts: in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps in Battlefield 4, a 22% increase in frame rate, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you are willing to bet that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don’t get much. Well, surely raw specs don’t tell the full story, right? Let’s look at some real-world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at just the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate I had to look at Tom’s Hardware (sorry, miners, GPU Boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050 - wait. 97? That seems too low… I mean, the 3 GB version got 99. Well, let’s see what TechPowerUp has to say... 94.3 fps. A 74% increase. Huh. Alright, alright, maybe that was just a dud. We can gloss over that, I guess. OK, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase the frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but increase the cost by 129%, and you only get a 79% increase in frame rate (a -50% cost/power gap). Increase it by 358%, and you increase the frame rate by 218% (a -140% cost/power gap). That’s not paying “more for much more power”; that’s a steep drop-off after the third cheapest option. In fact, did you know that you have to get to the 1060 (6 GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6 GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
A 128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option in this video (the 6700K) was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU on that list for just about all of the games. …That CPU was the lowest i3 option (the 6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or smaller difference in frame rate. Even the lowest Pentium option (the G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (besides the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
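If you want the GPU diminishing returns laid out side by side, here’s a minimal sketch using the percentage figures quoted earlier (the GPU Boss / Battlefield 4 numbers relative to a GTX 1050; the "fps gain per unit of extra cost" metric is my own framing, not something from those sites):

```python
# Cost increase vs. frame-rate increase over a GTX 1050 baseline,
# using the percentages quoted above, expressed as fractions.
cards = {
    "GTX 1050 Ti":  (0.27, 0.22),   # +27% cost, +22% fps
    "GTX 1060 3GB": (0.83, 0.83),
    "GTX 1060 6GB": (1.29, 0.79),
    "GTX 1080":     (3.58, 2.18),
}
for name, (cost_up, fps_up) in cards.items():
    # fps gained per unit of extra cost: >= 1.0 means performance scales
    # at least as fast as price; < 1.0 is the drop-off described above
    print(f"{name}: {fps_up / cost_up:.2f}")
```

This prints roughly 0.81, 1.00, 0.61, 0.61 in order, i.e. only the 1060 (3 GB) step scales one-for-one with its price, and everything above it falls off.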
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which, for some, fueled the idea that console hardware is difficult to develop for compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3 - they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out on PS3 in the same year as Left 4 Dead (2008). Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought out CPUs that were easier for developers to work with, which included making decisions that considered the consoles’ usage for more than gaming. Besides, using single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs’ lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusives can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them. Their initial funding lasted for 6 months. From then, Sony offered additional funding in exchange for console exclusivity.
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn’t an apples-to-apples comparison, sure, so if you want to compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million. Now, back to that 65 million total user figure for Steam: the best reference I could find for PlayStation’s number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, increasing 6-fold. Considering that the PS4 is already at 2/3 of the PS3’s lifetime sales, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of games, so the PS4 and Xbox sales, once digital sales are included, are even higher than 3 million. This isn’t uncommon, by the way. Even with the games where the PC sales are higher than either individual console’s, there generally are more console sales in total. But, to be fair, this isn’t anything new. The number of PC gamers hasn’t dominated the market; the percentages have always been about this much.
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don’t separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game, and we should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but it needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. There are upsides and downsides on both sides that the other doesn’t have. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or how, even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
For years, years, I wondered – ‘why me’ – you know, you know, kiddo – ‘why me’ – but there is no ‘why me’. What? As if there were, you know, ‘chosens’, there’s no ‘chosens’ – there’s no all–seeing, all–knowing powerful nothing. It happened. That’s it. I fell for it. I took its bait – hook, line, sinker. Didn’t I do it to myself? Wasn’t I the sucker? There’s no ‘why me’ – and once I realized that, that there was no, that there was no, no any kind of justice whatsoever, until I acted, that gave my existence purpose. And now I’m gonna fulfill that purpose. I don’t want you getting involved. You’re deep enough as it is. Don’t be the sucker! – Bobby Mortaren; famous last words
I raced from the house to the hotel, at Walsenburg, where I struggled to make sense of everything that had transpired. I pored over the notes and records that I had brought along. Only my laptop’s glow illuminated the room. Every so often lights from I-25 swept across the bed. Every so often breezes stirred trees around the perimeter. Soon midnight passed. The world darkened, relaxing as it were into slumber. A knock rattled the door - and I could have shrieked if it weren’t for what remained of my nerves. All of a sudden, I felt so icy, so cold, that I stood, frozen, uncertain of how to proceed. Who was it? It couldn’t be good. Not the FBI. Not the Thules. Ache, already? I balked at chucking my laptop - whoever they were at the door, they’d find it, they’d find it. It’s the 21st century; evidence doesn’t vanish without a trace. As my heart pounded my chest, I reached the door and cracked it a notch. I braced for the kick certain to follow. It didn’t come. The hotel’s courtyard / lot spread, deserted except for my rented Wrangler. There wasn’t anyone - anyone who may have been my visitor. Yet - by my feet - at the edge of the threshold - my visitor had left a box. I poked at it with my pole and turned it over and over. It wasn’t postmarked. It wasn’t addressed. It had been delivered by hand and, suspecting what it was, I yanked it inside. Leaning onto and drooping against the door, I tore its lid. The box contained two floppies, a CD, and a stack of paper. It was Blue Beelzebub - all of it, every part of it. As well as instructions: a How-To-Guide for destroying your future, fetched onto my doorstep, white-glove-style to boot, as promised. It may as well have been a bomb. ### How did Blue Beelzebub mutate into my obsession? Worse - did I expect to find its truth remarked into code from 1996? 1996! There wasn’t a lot to the internet way, way back when. But crime was crime no matter its era. Was it crime?
And did the game start this way or that way then evolve into crime? Was it crime from its start? The programmer of Blue Beelzebub, a hacker by the avatar ‘ZuZu’, claimed to be legit. Their MO had been to create games, not scams. Or so it appeared until Blue Beelzebub entered the story. If it were a product of malware, why had ZuZu devoted so much of their effort to its creation? Why had they boasted of the game’s nitty-gritty details during its gestation? Why all of that trouble, if only a fraction of it would be appreciated by those who played it? Even LVN, when they weren’t laundering bitcoin, expressed what may be described as passion for that game. Was it a game? By 1996 standards, its demos paraded atrocious graphics and threadbare mechanics. The way it affected the player’s rig ensured nobody would be eager to replay it. The game passed every scan available, yet it twisted the OS and hijacked the PC to serve as a node, a link into a yet-unknown and yet-unnamed network for purposes every bit as mysterious as the game itself. As I contemplated the reality of the situation, I settled onto the notion that the game may have been a gimmick to cover truly malevolent intentions. That had been the crux of LVN’s KickStarter and GoFundMe rackets - they always proposed plausible if lofty projects as if they were real, actual products people could buy. However, case after case demonstrated that their pretense unraveled under scrutiny. Could it be, as far back as 1996, that the creator(s) of Blue Beelzebub conceived of such a deception? FPSs (of the type Blue Beelzebub purported to be) were all the rage through the 90s. If so, then their MO resembled that of a typical bait-and-switch scheme - bait them with a game, switch them with a virus. Then? What? Profit? ### In the summer of 2017, Czech authorities, in conjunction with the EU, arrested LVN at their apartment south of Plzen. They seized the hacker’s laptop, PC, as well as their twenty-thousand-CD library.
LVN was a hacker-for-hire; evidence presented at their arraignment demonstrated to the court that they had been paid by Russian and other Eastern European actors to pilfer bitcoin wallets. In addition to theft, the court entertained charges connected to a NiceHash heist of 64 million euros earlier that year. It was the breach of NiceHash’s security that brought my skills to the EU’s attention. For a few weeks, between March and May, I played my part to aid the investigation and the conviction of its mastermind. We discovered that the breach had been directed from inside NiceHash. We split the work: ‘brick and mortar’ detectives ran interviews and stakeouts while my fellow ‘white-hats’ and I toiled at the forensics. To meet our end of the bargain, we created a model of the cyber-attack, in order to construct and deconstruct its operation. As we realized how the crime had been executed, we identified the party responsible and built the authorities a solid chain-of-evidence - a chain-of-evidence that identified LVN as the perpetrator. LVN masterminded not just the NiceHash heist but a dozen scams at sites like KickStarter and GoFundMe. LVN traded exclusively in bitcoin. Their MO was to sow fake projects, then to reap real funds submitted by backers - by backers who aimed to launder money via its exchange into bitcoin. Projects were advertised to those who sought the service; they were fraudulent through and through, yet they appeared real enough to fool the maintainers of those sites and the public at large, who may have been tricked by the scams. Under the supervision of the investigation at large, I pledged my dollars to a few of LVN’s projects, to see what the response would be. Soon, LVN and I exchanged emails. They wanted to speak face-to-face. In front of the experts, I played to type and gained access to a roster of services from that hacker-for-hire.
As a result of that communication, the investigation brought into play anti-trafficking and anti-exploitation agencies from around the world and accelerated their goal to convict LVN. One of the projects LVN advertised didn’t fit the mold, insofar as it felt like a genuine hobby of theirs. LVN sought investors to fund their (re)development of a game, Blue Beelzebub. The project listed at KickStarter - removed, but saved to my laptop - included a lightbox of images and demos as well as snippets of code. It discussed such esoterics as: updates to its physics engine and its video & audio renderer; upgrades to its arsenal and its gallery of foes; changes to its play - expanding its levels and ditching its linearity. The details impressed me as they perplexed me. Why? I kept asking. What’s the idea? What’s the racket? Why create a game using twenty-year-old technology? I understood its esoterics perfectly, for I came of age during the 90s. So much of what went into Blue Beelzebub felt familiar because it was familiar. An FPS - first-person shooter - propelled by a fork of that fabled 2.5D DOOM engine. Little wonder that its caps conveyed the look and feel of classic 90s PC games! Maybe it was yet another scam? Or - maybe - it was the hobby of a gamer / programmer? Could it be that LVN recalled those early DOS games and wanted to re-create the era? But that wasn’t everything. And as I mused & Googled, I started to ask myself if there wasn’t more to Blue Beelzebub beyond the haze of my nostalgia. I failed to connect the dots, although that did not shake the deja vu - somehow, someway, I recognized that game.

###

Escape published my article about LVN’s conviction. Against the advice of my editor, I stalked its comment section, to see what, if anything, the story drew out of the woodwork. Its aside re: Blue Beelzebub attracted attention. I wasn’t surprised, to be honest, as I had inserted it into the text to draw a reaction. And my ruse worked!
But I wasn’t the only one who felt deja vu about the game. A commentator, who asked for anonymity, posted a link to a 4CHAN thread about Blue Beelzebub. LVN had advertised the KickStarter for the game at a group devoted to indie developers. LVN never advertised their work at 4CHAN, out of fear of exposure. So that thread, where they didn’t ask for money, confirmed my sense that it wasn’t, necessarily, a scam. As I scanned the thread, however, I realized what a rabbit-hole the business would be. After LVN’s post, anonymous replies went to and fro as they typically do. Then the tenor of the thread devolved into a war between those who were for and those who were against what LVN proposed to do with the game. It was a question of credit. At last - somebody revealed a truth I had duly suspected - that Blue Beelzebub wasn’t the work of LVN - that the game as it existed predated LVN by twenty years or so. The idea for Blue Beelzebub had floated about USENET c. 1995. The majority of the conversations extracted from the archives suggested that the game was vaporware. Its supporters countered that either a P/C or a DEMO existed and that a play-through had been uploaded to (early) YouTube. Everyone who added their opinion - pro & con - agreed that it was “inspired by Satan”, that it “took its cues from Crowley’s ‘Thelema’“, and that it included clips “replete with ever more corrupt” gore and snuff. A self-described player, whose rig, they claimed, had been “totaled” by the game, stated bluntly that it contained a “Chinese Sandwich”. Undeterred by the confusion, I kept at my search, ramming through the archives, pushing my way further back in time, from 1997 to 1995. USENET had been mirrored prior to its collapse, yet its content was not indexed completely; a robust query of its posts required force and patience.... In spite of the odds, my effort worked; my persistence located the roots of Blue Beelzebub.
It was a post dated June 15, 1995, written by the game’s originator, a hacker by the name of ZuZu. In that missive, they claimed to have produced “a proof of concept demo” for their “latest and greatest” game, Blue Beelzebub, and that it was “a legit game catering to those who worship and admire Lucifer and everything that stands for”. ZuZu listed, point by point, the substance of their creation. I wasn’t surprised to see, splattered across that post, the verbiage LVN usurped for their own advert. Except - they weren’t seeking funding. According to the missive, the game had been bankrolled “by entities of a foreign sort, who don’t want to be credited”. Rather, they were seeking “experts” willing to alpha & beta test the product. Blue Beelzebub, and by extension ZuZu, went dark between 1997 and 2005. Then - October 31, 2005 - ZuZu submitted their last known public statement. Broadcast with their usual over-the-top flamboyance, they wished for their “fans to learn and spread the word” that they had “secured an exclusive”. They had convinced a devotee of indie horror / FPS games to review Blue Beelzebub. The player they had snagged was famous in his day, and his name I recognized as I read it. Bobby Mortaren - an internet pioneer par excellence. His format, mixing reviews and play-throughs together, had been lauded as visionary and just as widely imitated. Tweaked a bit here and there, it continued to find use. His name, though, hadn’t been spoken of for a decade. Games had changed. Tastes had changed. He could have shifted into yet another venture, for all I knew. Mortaren posted his works to YouTube - to YouTube prior to its acquisition by Google. As I considered the changes that had transpired across the years, I wasn’t surprised to discover that all of my links to his works were dead. Eerily, though, it was impossible to locate his reviews directly via YouTube. So I tried Google and Bing. No result. Ditto with DuckDuckGo. Ditto with Wiki, SlideShare, BoardReader.
Out of desperation I surfed into the remnants of AltaVista - maybe its database saved the information? No. No. Futile - all of it. YouTube dwarfed USENET; my task had grown by a colossal order of magnitude. If that which I pursued had not been deleted, then it would be found, ad finem omnium. So to dig further I opted for a quick & dirty hack - a bot. A bot scripted to sift and sort all of YouTube’s content matching the keywords Mortaren and Blue Beelzebub. I ran it and waited for days, then for weeks, then for months.

###

My extensive search corroborated that Mortaren left the internet c. 2006. Assuming he may have continued under a pseudonym, I enquired into the matter with colleagues who devoted themselves to games and / or to reviews. Only a few recognized his name; nobody was cognizant of his voice. An editor from ToplessRobot directed my attention to a defunct fansite’s messageboard, where somebody asked why Mortaren vanished without a trace. To my shock, the reply was that Mortaren had been arrested by the FBI c. 2006. I could not fathom why. Nevertheless, if the revelation were correct, then the resolution of the matter was tantalizingly within reach. Arrests - and trials - were public. The LVN / EU case brought my forensic skills to the notice of the DOJ and the Treasury / Secret Service. The FBI, like its European counterparts, wanted to understand everything about bitcoin and how it might (might) be possible to trace transactions to individuals. As part of my freelance work, I had already met and debriefed FBI agents re: the Czech hacker. Eventually ‘large’ talk gave way to ‘small’ talk amongst us. It was at that juncture that I broached the subject of Blue Beelzebub - namely, that LVN hatched a scheme to defraud investors (via bitcoin), ostensibly by promising to develop an update to that game. “They got exposed by players who recognized the game’s ill-repute,” I stated.
“Apparently, the game’s infamy started after its reviewer, a fellow by the name of - er - Robby Mortaren? Bobby Mortaren? Well - they got arrested by the FBI.” Neither the game nor the reviewer elicited a reply - not immediately, anyhow. A (censored) document, summarizing a DOJ investigation, worked its way into my mailbox. Mortaren had been under FBI surveillance from November 2005 to May 2006. Why wasn’t stated - only that the FBI had obtained search warrants for his computers & electronics. A federal judge issued an arrest warrant May 30, 2006; however, the DOJ withdrew the charges after Mortaren agreed to an immunity deal. Mortaren turned star witness at a trial that involved organized crime as well as rackets, cults, ritualized human & civil rights abuses, and elements that suggested Satanism. The perpetrator(s) the DOJ wanted to convict fled either to South America OR Eastern Europe / Central Asia. The trial evaporated; neither the charges nor the perpetrator(s) were detailed. Mortaren’s immunity deal with the DOJ wasn’t negotiable or retractable and included a complete internet ban. The document listed a PO BOX as Mortaren’s permanent address.
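For the curious, the “quick & dirty hack” described earlier - a bot scripted to sift YouTube for content matching the keywords Mortaren and Blue Beelzebub - might have looked something like the sketch below. To be clear, this is my own illustrative guess, not the narrator’s actual script: the YouTube Data API v3 `search.list` endpoint is real, but the paging loop, the keyword filter, and the `API_KEY` placeholder are assumptions.

```python
# Hypothetical sketch of the keyword-sifting bot: page through YouTube
# search results, keep any video whose title or description contains
# every keyword. Uses only the standard library.
import json
import urllib.parse
import urllib.request

API_URL = "https://www.googleapis.com/youtube/v3/search"

def matches_keywords(text, keywords):
    """Case-insensitive check that every keyword occurs in the text."""
    haystack = text.lower()
    return all(k.lower() in haystack for k in keywords)

def scrape(api_key, query, keywords, max_pages=50):
    """Yield (videoId, title) for every search result matching all keywords.

    `api_key` is a placeholder; the caller supplies a real Data API key.
    """
    page_token = None
    for _ in range(max_pages):
        params = {"part": "snippet", "q": query, "type": "video",
                  "maxResults": "50", "key": api_key}
        if page_token:
            params["pageToken"] = page_token
        url = API_URL + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url, timeout=30) as resp:
            data = json.load(resp)
        for item in data.get("items", []):
            snippet = item["snippet"]
            blob = snippet["title"] + " " + snippet["description"]
            if matches_keywords(blob, keywords):
                yield item["id"]["videoId"], snippet["title"]
        page_token = data.get("nextPageToken")
        if not page_token:  # no more pages to sift
            break

# usage (hypothetical):
#   for vid, title in scrape(API_KEY, "Blue Beelzebub", ["Mortaren", "Beelzebub"]):
#       print(vid, title)
```

Running something like this for “days, then weeks, then months” would mostly be a matter of re-running the loop and diffing results, since the search index changes over time.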
To Mr. B. Mortaren:

Sir, I apologize. Blue Beelzebub. Were it not for the fact that you may be the only person left to recall that game, I would not have stretched my resources so thin to find you. If you are not able to assist my research, is anyone?

I was part of an EU investigation re: bitcoin theft & fraud, as well as the trafficking & exploitation of vagrants. Through that investigation I came into contact with a hacker; they claimed to be working on Blue Beelzebub; they sought funds to upgrade it. While disturbing, to say the least, that game did not strike me as part of the hacker’s MO. So I pried further into the matter and discovered, to my astonishment, that Blue Beelzebub dated to the mid 90s and that you had reviewed the demo and posted it to YouTube.

I am curious about that game. I cannot get it out of my head. Who was the programmer? Who was the developer? Where did they get the money? What were their goals? What was the game about, if the game was about anything?

A DOJ document summarizing your immunity from prosecution was brought to my attention. Matching the timeframe of the FBI’s surveillance and arrest to the demo, I suspected that these matters are related. I was not able to confirm a link, as all records, transcripts, etc., were sealed at the request of the FBI.

If, for any reason whatsoever, we cannot communicate about this matter, would it be possible to contact a surrogate or anybody else with the information I seek?

With All Due Respect,
JK
###

Due to limits that existed at YouTube’s debut, videos posted from 2005 to 2010 were capped at 10 minutes. Both image and sound playback quality were kept low to spare bandwidth. A lack of (accessible) software and hardware for editing video forced vloggers to improvise. Mortaren had always used a webcam and mic from the 90s to shoot his videos ‘live’, i.e., without edits. YouTube retained the majority of Mortaren’s content; however, after a check of the dates and the posters’ IDs, I determined that Mortaren’s videos had been reposted c. 2006 by another user. If the titles / numbers were correct, then there were seven parts to the demo Mortaren recorded for Blue Beelzebub. Of the seven, six remained. Specifically, the 5ifth - which must have been filmed, as evidenced by the discontinuity between the 4ourth and the 6ixth - defied my ability to trace. The reposter stated that “the 5ifth wasn’t part of the review package”. Yet, as I perused copies of replies they had saved - commentary that referenced material that doesn’t appear anywhere else - I strongly suspected that a 5ifth had been posted for a while and, for whatever reason, Mortaren removed it prior to 2006.

1irst - details facts re: the game: the developer, the programmer, the system requirements, etc. “If your rig’s able to run DOOM, Blue Beelzebub works,” he states, then adds: “although, prepare yourselves, kiddos, the game takes a very, very long time to install”. Passingly, he adds that a fan of his had ditched the game after they experienced “a catastrophic system failure” that they blamed “on either a bug or a virus or both”. The executable and its auxiliary files pass every virus and malware checker Mortaren throws at them.

2econd & 3hird - demonstrate the gameplay, or what passes for it. Mortaren prefers to record his reviews live so that his fans experience the game exactly as he does. His videos contain hints / cheats as they are discovered while he plays.
He describes Blue Beelzebub as a DOOM-GUY-ESQUE player who moves through an enshadowed, monochromatic maze. “There’s no backwards, I, I, I don’t believe it! Did they forget to give us backwards? There’s forwards and left, right. Kiddos, you gotta do a circle to go backwards.” He continues to berate the game, adding: “Yeah, there’s only forwards. And you know, I gotta say it, the programmer may think they’re the monkey’s nuts for it.... But it’s so weird that going forwards causes the view to bob up and down or side to side. What’re they trying to do? Are they trying to replicate a player’s gait? Takes me right out of the game. Let me tell y’all why. Like I said, the programmer’s got to be thinking they’re the monkey’s nuts, but it’s that bizarro attention to detail that’s so jarring as I consider the lack of detail given to the graphics. Guys. Guys. Guys. You gotta think about what you present.” Mortaren piles on criticism of the graphics and the sounds, comparing both unfavorably to DOOM. Especially frustrating is the invariance of the black & white textures throughout the maze. He praises the maze’s responsiveness to the player, noting, while attempting to map the maze, that its passages shift at random. More and more criticisms are strewn at the game: its lack of weaponry, its lack of powerups / extras, its lack of anything. “A game can’t be about going through the maze, guys, there’s got to be a point - something to do!” Finally, he voices the suspicion that he has been duped by ZuZu.

4ourth - the demo gets interesting. Mortaren finds an area of the maze where the textures differ. The video’s pixelation - perhaps due to the webcam, perhaps due to the way the reposter preserved it - masks the bulk of the alteration. I detect a change of shade, though, from black & white to blue. “Well it can’t be for nothing that the wall is blue.
Jeez!” As he cracks the joke, to his shock (an expletive slips), the sounds become “eerie, drone-like notes fading into reverb” and the monitor displays a still-shot. Mortaren zooms into the image; I recognize it as coming from the shock-site ROTTEN. After that alteration, every blue-hued texture Mortaren faces produces other images, increasingly nihilistic and graphic, usually of the dead or the dying, often of celebrities, suicides, accidents, wrecks.

5ifth - ?

6ixth - the segment starts at an awkward jump. It must have been split from the 5ifth video, and while Mortaren does not state why, explicitly, the tone of his voice suggests that something serious transpired. “Sorry, kiddos, I turned the webcam away - a first - I guess this ZuZu accomplished something.” When he returns the webcam to the monitor, it is apparent that, in addition to the tone, the substance of the game itself has altered. The player stands at the center of a room Mortaren describes as “a vault with a hole at its floor”. The 2.5D renderer prevents the player from gazing inside the hole. But by directing the player to walk the hole’s circumference, it is possible to catch bits of its contents. A sharp, blue light shoots out of the hole; the way it casts light on the ceiling suggests there might have been “water”, as if the hole were a well of sorts. What shocks Mortaren is that the room fills with children. The renderings of the faces make each of the children unique. However: “the ghastliness of the imagery resembles how faces voxelate, like with Delta Force games”. Further, he notes, after a pause that echoes my own consternation and trepidation: “I’ve seen these kids. Yeah, I’ve seen these kids from those, those photographs the game stopped everything to show us. Jeez!” The children stand statue-like as the player walks about them.
They serve as obstacles that block movement but are otherwise inert, unresponsive, “not that the player interacts with the kids, as there’s no other keys available except A, W, D”. The video continues; then Mortaren shrieks. The playback jostles as if it were about to stop. When everything resettles, he speaks, calmly and evenly: “there’s a kid that’s different ... animated. You gotta see it, kiddos, I can’t say if it’s awful because it’s awful or if it’s awful because it’s awful....” The webcam zooms into the monitor; the child rendering appears to breathe, haphazardly, its mouth agape. And then, then the child moves, and player and viewer alike slip an expletive. “I take it back, everything, this is truly and utterly awful.”

7eventh - the coda feels like the set’s longest but is the shortest. “Right now I’m running. I don’t have a weapon, jeez! I’m running as fast as this keyboard allows but my health is shrinking.” Mortaren stops and rotates the player to face backwards. The animated child is behind, striking the player using a technique that resembles “Hanna-Barbera laziness - or who knows - who knows, kiddos, it could be part of the style”. Just as with DOOM, as the player’s health decreases, the view gets redder and the avatar gets bloodier. Mortaren peers into the maze; there is no exit, there is no weapon, no upgrade to assist; all that exists is the floor, where the player drops, dead. The 7eventh adds a post-script recorded after the demo. It shows Mortaren’s PC, opened and split to pieces. “The game installed a virus,” he declares, then describes its symptoms. “Immediately upon my player’s death, the PC rebooted. After the BIOS, instead of going into DOS, it starts a telnet session and tries to connect via IP. Of course it doesn’t get a reply since my PC uses dial-up.
So it freezes, pinging and pinging a server somewhere that it cannot reach.” Mortaren concludes by theorizing that if Blue Beelzebub were a virus, it must have been designed to target high-end systems with LAN / Ethernet ports. I jot down the IP and attempt to connect to it. Strangely, it will not load, yet it will not issue an error of any kind. Chrome, FireFox, Edge, etc., freeze. WHOIS is not able to resolve the owner. Nevertheless, it yields the location of the server, a site approximately 50 miles northeast of Trinidad, Colorado. I reject the result; users of tracers already know that they rely on ISP databases to match IP to location - and how often are those databases updated? - and how often are those updates distributed? The decade that passed between today and the video, and between the video and the game’s creation, assures that there must have been drift re: the location of the IP.

###

I will not reveal the particulars of when, where, and how I received the call. “The coordinates.” Into my ear spoke a voice that my investigation had made familiar. “Check the coordinates.” “Coordinates?” “Blue Beelzebub.” “Yes,” I replied, and Mortaren implied we’d meet. Mortaren had traced my whereabouts through the blogosphere. He wanted to talk about the game yet feared the government “and or others” eavesdropping. I admitted, off-handedly, that as I sank into my work with the DOJ, my paranoia spiked. “What’s the deal with the game, anyway?” “What do you want on your Chinese Sandwich?” My impression settled into a mixture of intrigue and trepidation. The matter felt so cryptic as to defy credulity. Coordinates? Blue Beelzebub. Chinese Sandwich? Nevertheless, even as we talked (brief as the conversation was), I put together that by coordinates + Blue Beelzebub, Mortaren referred to the IP the game had telnet’ed.
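As an aside, the lookup the narrator describes - WHOIS failing to resolve an owner but still yielding a rough location - can be sketched as below. The WHOIS protocol really is just a raw text query over TCP port 43; the server choice (`whois.arin.net`) and the ARIN-style `City:` / `StateProv:` / `Country:` field names are my assumptions, and, as the narrator notes, any location such a lookup returns is only as fresh as the registry and ISP databases behind it.

```python
# Hedged sketch of an IP -> registrant/location lookup via the WHOIS
# protocol: open TCP port 43, send the query terminated by CRLF, read
# the plain-text reply, then grep it for location fields.
import socket

def whois_query(ip, server="whois.arin.net", timeout=10.0):
    """Send a raw WHOIS query for `ip` and return the text reply."""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall((ip + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_location(reply):
    """Pull City / StateProv / Country lines out of an ARIN-style reply."""
    wanted = {"city", "stateprov", "country"}
    found = {}
    for line in reply.splitlines():
        key, _, value = line.partition(":")
        key = key.strip().lower()
        if key in wanted and value.strip():
            found[key] = value.strip()
    return found

# usage (hypothetical): parse_location(whois_query("198.51.100.7"))
```

Nothing in the reply pins the server to a street address; it reflects where the netblock was registered, which is exactly why a decade of churn can leave the coordinates pointing at empty country outside Trinidad, Colorado.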
Newbs might not know this, but bitcoin recently emerged from an intense internal drama. Between July 2015 and August 2017, bitcoin was attacked by external forces who were hoping to destroy the very properties that made bitcoin valuable in the first place. This culminated in the creation of segwit and the UASF (user activated soft fork) movement. The UASF was successful: segwit was added to bitcoin, and with that the anti-decentralization side left bitcoin altogether and created their own altcoin called bcash. Bitcoin's price was $2500; soon after segwit was activated, the price doubled to $5000 and continued rising until here we are today at $15000. During this drama, I took time away from writing open source code to help educate and argue on reddit, twitter and other social media. I came up with a reading list for quickly copypasting things. It may be interesting today for newbs or anyone who wants a history lesson on what exactly happened during those two years when bitcoin's very existence as a decentralized low-trust currency was questioned. Now that the fight has essentially been won, I try not to comment on reddit that much anymore. There's nothing left to do except wait for Lightning and similar tech to become mature (or better yet, help code it and test it). In this thread you can learn about block sizes, latency, decentralization, segwit, ASICBOOST, lightning network and all the other issues that were debated endlessly for over two years. So when someone tries to get you to invest in bcash, remind them of the time they supported Bitcoin Unlimited.