When most people think of videogames, they think of first-person shooters, anything from the Mario or Zelda franchises, or Minecraft. It's instinctive; I used to think that way too. These are probably the most popular examples with the general public, but a completely different mold of game emerged near the turn of the millennium and experienced an explosion of popularity. Games that fit this category can be found in multiple genres, although they are by far the most numerous among RPGs (role-playing games). I'm not talking about an art style or a new hardware platform; I'm referring to games that transcend the bounds of regular games and have become a storytelling medium.

Literature is a wonderful format for storytelling, but it has its weaknesses. It can be very engaging, yet compared to film it lacks visuals and is more taxing on our senses, the eyes in this case. Feature-length films are often filled with suspense and evoke strong emotions such as awe, fear, sadness, and amusement. Films, however, are limited not by ease of delivery but by length, typically one to three hours. Television series fix this by offering long runtimes broken up into small segments that you can ingest at your leisure. Everything I've described so far has been fairly objective, and while I can't point to an objective improvement over television series, I can put forth a situational weakness: you are a passive observer, entirely removed from the narrative.

In a videogame, the player is put directly into the action, where their decisions have a direct impact on the progression of the story. In rare cases they can even change the final outcome! The main character isn't the one exploring the temple, you are. You are the one driving the car, you are the one fighting the bad guys, you are the one meeting a god, you are the one saving the world. In videogames, you are the center of the show.

Videogames also tend to have longer playtimes than television shows, particularly shows made for streaming platforms, which tend to be much shorter. Playtimes often run in excess of 50, sometimes 100, hours. While some of that time is spent bumbling around, a well-designed game will draw players in so that they can't advance the plot fast enough. I know this feeling very well; it's what I felt playing Xenoblade Chronicles 2 and Final Fantasy XV.

Modern games have the additional advantage of being designed for powerful modern hardware, which allows numerous graphical effects to be generated in real time that rival big-studio CGI movies. Cutscenes can elevate this to a new level, as a prerendered cutscene is not bound by the player's hardware, allowing anyone to enjoy it. Even if you can't afford the best computer parts, you can still enjoy many games on a regular computer if you dial down the settings. Alternatively, you can buy a console for $200-500. Microsoft's Xbox, for example, comes in a $500 version (Xbox Series X) and a $300 version (Xbox Series S). Although the more expensive one is more powerful, the difference is rarely dramatic, because games are designed around the baseline system and the extra horsepower, while noticeable, is seldom transformative. Your returns diminish further when you compare a $500 Xbox Series X to a high-end gaming PC, which will cost between $1,200 and $2,000. Yes, the PC version will look better, but the differences are usually mild.
Additionally, modern story-driven games are usually, but not always, designed so that you can complete the story regardless of your skill level. This is implemented through a variety of techniques such as level scaling, adjustable difficulty, strategically placed predetermined encounters, or simply abolishing the leveling system entirely. As is true for literature and film, narrative-driven videogames are not for everyone; they are for people who want a masterful story. Some people think cartoons are only for children (they're dead wrong, watch Avatar). Similarly, some people dismiss videogames as childish, or say they'll never be any good at them, so why bother. I believe that even the most technologically inept, even technophobic, people can and will enjoy videogames if they have the guts to jump in headfirst and give it an honest go.

Yes, unless you want to play visually unappealing games you will likely need at least a basic console, which will run you $300 new or $100-200 used, plus around $40-70 per game. This entry barrier can be softened with services like PlayStation Plus, PlayStation Now, and Xbox Game Pass ($10-15 per month). These are Netflix-style subscriptions that, in one way or another, give you access to tons of games for a low monthly cost. Xbox All Access takes this a step further by bundling a console and Game Pass together for a monthly fee ($25-35). A comparison could be drawn to reading The Lord of the Rings for the first time: it is an intimidating endeavor due to the size of the volumes and their advanced vocabulary, but once you read them you will be astounded by their excellence. What are you waiting for? Go on, try a game! If you need suggestions, ask a gamer friend, look up the best story-driven games, or check out my list of best games for players new and old. There's plenty to choose from!
My first real encounter with embedded computers was all the way back in 2015 on a Raspberry Pi 2B, which packed a blazing quad-core 900MHz CPU and a whole gigabyte of LPDDR2 RAM. I remember booting up Raspbian Wheezy and playing games like "Squirrel Eat Squirrel" and "Minecraft: Pi Edition". I also wrote my very first "Hello World!" program on that Pi. I used that Pi 2B to learn programming fundamentals like variables, if-else statements, while loops, for loops, input statements, lists, and functions. Despite their simplicity, I found it thrilling to write these programs. I derived joy from having stupidly long for loops or naming my variables after memes.
I remember the CPU would get warm, sometimes even hot to the touch, so I put some heatsinks on it because I thought it would run faster. I didn't realize the chip was too simple to throttle, and what felt hot to me was the CPU barely breaking a sweat. It was fascinating to see a whole computer on such a tiny, cheap board; I was blown away! Sure, it wasn't as powerful as a laptop or desktop, but Raspbian was so light and my programs were so simple that it didn't matter. Since the original Raspberry Pi was so tight on RAM and had such a weak CPU, Chromium didn't come with Raspbian, but when Raspbian PIXEL arrived it was a total game-changer: Chromium was available on the Pi! The Pi was now a real desktop replacement; it was probably almost as gutsy as my Chromebook, which is to say not very. Nevertheless, it seemed magical to me. The Raspberry Pi was my gateway system: it helped me learn to program, it introduced me to embedded systems, and it taught me that computers can be fun. Nowadays, Raspberry Pis are far more powerful than my old 2B and come in much smaller sizes, but they haven't fundamentally changed. Besides, the old 2B is still great for applications with limited thermal headroom; every Pi has its place. This summer I'm going to make some firebending gloves, and to do that I'm going to use either a Raspberry Pi Zero or a Raspberry Pi Pico to read an accelerometer and control a valve that releases bursts of fuel. Both of these Pis are even weaker than my old 2B and pack far less memory, but that's not a problem; they're extremely cheap and have very low power consumption, which is why I'll use them. Kudos to the Raspberry Pi Foundation for making coding and embedded tech accessible to anyone; my life wouldn't be the same if I hadn't first booted up my Raspberry Pi 2B. I think everyone should give a Raspberry Pi a shot, whether you want to animate a cool project, learn to program, build your own smart home, or even just try one as a desktop computer. You never know what you'll find out! Look out for more Pi-related articles in the future.

We're right in the middle of Nvidia's GPU Technology Conference (GTC), and I recently learned about something amazing: the RAPIDS software suite. RAPIDS is a set of GPU-accelerated Python libraries including CuPy, cuML, cuDF, Dask, and cuGraph, meant to replace the popular data science libraries NumPy, scikit-learn, pandas, and Matplotlib. The RAPIDS libraries are mostly drop-in replacements and support many of their CPU counterparts' core functions. RAPIDS is revolutionary because it uses Apache Arrow to limit the number of transfers between system and GPU memory, which addresses one of the primary weaknesses of GPU acceleration: the latency induced by the PCIe bus.
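To make the "drop-in replacement" point concrete, here's a minimal sketch of my own (not taken from the RAPIDS tutorials), assuming a CUDA-capable GPU with cuDF installed. The same groupby aggregation runs in pandas and in cuDF; only the constructor changes.

```python
# Toy comparison: the same groupby aggregation on CPU (pandas) and GPU (cuDF).
# Assumes a CUDA-capable GPU and the RAPIDS cuDF package are installed.
import numpy as np
import pandas as pd
import cudf  # GPU DataFrame library from RAPIDS

n = 10_000_000
data = {
    "key": np.random.randint(0, 1000, size=n),
    "value": np.random.rand(n),
}

# CPU: plain pandas
cpu_df = pd.DataFrame(data)
cpu_result = cpu_df.groupby("key")["value"].mean()

# GPU: cuDF exposes (nearly) the same API, so only the constructor changes
gpu_df = cudf.DataFrame(data)
gpu_result = gpu_df.groupby("key")["value"].mean()

# Copy the GPU result back to host memory to compare
print(cpu_result.sort_index().head())
print(gpu_result.sort_index().head().to_pandas())
```

The final `.to_pandas()` call copies the result back to host memory, which is exactly the kind of transfer Apache Arrow helps RAPIDS keep to a minimum.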
I worked through some tutorials and witnessed between 4x and 10,000x speedups on various tasks involving a million to a billion data points, versus identical CPU-only operations. I already had experience using CuPy, but with a mere ten thousand data points I wasn't anywhere near parity with CPU code; the datasets were simply too small for the GPU to pull ahead. What I didn't know is that with cuDF or Dask you can mix datatypes and even process strings! Turing SMs can execute INT32 and FP32 operations concurrently, and can even handle some FP64 with a massive performance hit if it must be done, which lets them take full advantage of RAPIDS' mixed datatypes. RAPIDS even lets you do quick and efficient K-means clustering as well as DBSCAN, depending on whether or not you know how many clusters your data should fit into (a small sketch follows at the end of this post). Network topology problems? RAPIDS can chew 'em up and spit 'em out like nothing. The same underlying CUDA primitives can even be used to accelerate databases with BlazingSQL. You really can breathe life into old hardware with only a GPU!
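Here's that clustering sketch: a rough toy example of my own (arbitrary parameter values, assuming a recent cuML release) showing how cuML mirrors scikit-learn's estimator API, with KMeans when you already know the cluster count and DBSCAN when you want point density to decide.

```python
# Rough sketch: GPU clustering with cuML's scikit-learn-style estimators.
# Assumes a CUDA-capable GPU with RAPIDS (cuML + CuPy) installed; the
# parameter values below are arbitrary and only meant for illustration.
import cupy as cp
from cuml.cluster import KMeans, DBSCAN

# Synthetic data: three 2-D blobs generated directly in GPU memory
cp.random.seed(42)
centers = cp.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]], dtype=cp.float32)
points = cp.concatenate([
    centers[i] + cp.random.standard_normal((20_000, 2), dtype=cp.float32)
    for i in range(3)
])

# KMeans: use it when you already know how many clusters to look for
kmeans_labels = KMeans(n_clusters=3).fit_predict(points)

# DBSCAN: use it when you don't; it infers clusters from point density
dbscan_labels = DBSCAN(eps=0.5, min_samples=50).fit_predict(points)

# With CuPy input, the labels should come back on the GPU as well
print("KMeans clusters:", int(cp.unique(kmeans_labels).size))
print("DBSCAN clusters (noise excluded):",
      int(cp.unique(dbscan_labels[dbscan_labels >= 0]).size))
```

Because the input starts out as a CuPy array in GPU memory, the whole pipeline stays on the device until you explicitly copy results back.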
Back in 2005, a new type of processor was prototyped with the goal of powering the most advanced console ever seen as well as the next generation of supercomputers: the Cell Broadband Engine. The Cell Broadband Engine, also known as the CBE or simply Cell, was a revolutionary design made by STI, an alliance formed by Sony, Toshiba, and IBM with the goal of building the processor of the future. Their design took a single high-performance IBM PowerPC core clocked at 3.2GHz, with support for two simultaneous threads, and put it on the same die as eight special cores designed with a single goal: to execute numerical calculations as fast as possible. The result was a heterogeneous-architecture processor that used a single CPU core, the Power Processing Element (PPE), for general-purpose tasks and for supplying data to the eight Synergistic Processing Elements (SPEs), which provided supercomputer-level performance at raw number crunching.

The resulting chip was used to power the PlayStation 3, coupled with a respectable 256MB of blazing-fast XDR RAM as well as the Nvidia RSX "Reality Synthesizer". The other specs were impressive, but the Cell was the real star of the show. After the PlayStation 3 debuted on November 11, 2006, IBM set its sights on something bigger: an improved Cell processor to power the next generation of servers. The resulting chip, the PowerXCell 8i, was used in the IBM BladeCenter QS22 server as well as in IBM Roadrunner, the first petascale supercomputer. It looked like the future had arrived. Except, in 2013 Roadrunner was dismantled and the PlayStation 4 launched with a conventional x86_64 processor. No new developments followed the PowerXCell 8i. Why was this? My theory is based on one of the PlayStation 3's shortcomings as well as some digging I did into IBM documentation: the Cell Broadband Engine created a positively dreadful experience for developers, while Nvidia's CUDA-enabled GPUs were faster and could be integrated into existing systems. In addition to the daunting obstacles Cell's API posed to developers, Cell saw limited adoption and was only available in high-end servers and the memory-starved PS3, neither of which was suitable for the masses. In this sense it was inevitable that CUDA eclipsed Cell: CUDA was accessible to anyone with a computer and $350 to spend on an Nvidia GeForce 8800 GT, which was almost a match for the Cell in raw throughput and had double the memory of the PS3. Since then CUDA has taken off, and we've never looked back.