PC builder’s primer

I have helped a lot of people over the years.  One thing I really enjoy is helping people learn to build PCs, and helping them pick out the best parts and components for their use, wants, needs, and budget.  Along the way I've figured out some very basic things anyone can use to help make decisions about a new PC.  These decisions apply to self-built or professionally custom-built PCs, and can also help you decide whether an off-the-shelf computer would be better.  Here I make little distinction between types of computers, be it a desktop, laptop, server, matchbook computer, etc.

The very first thing anyone looking to build or buy a new system must settle on is the budget for it.  Ideally, that budget should be cash in hand, or capable of being so.  In enterprise situations, some companies allow for loans, paying over time, leases, etc.  We're not going to discuss those payment methods here – if you're at that level, you should already have a good idea of what's suitable for your company.

If you're "cash strapped," "broke," or otherwise unable to afford all the parts, components, or systems right away, consider carefully whether it's worth spending your limited funds on something you won't be able to use until you spend even more.  With that said, it's quite possible to piecemeal a system together over time.  If you do, it's very important to buy the best parts you can, as they will age, and by the time the system is bootable it may well be outdated.  At this point, if you're unsure you can afford this at all, continue reading for future knowledge, but I highly advise not ruining your life for any material luxury.

Overview of decisions:
1) Budget – as mentioned above.  This is the most important factor for most people.
2) Form-factor – The physical size and aspects of the device.
3) Architecture – The type of system, specifically based on the CPU
4) Primary use – What does this system have to do?
5) Secondary use – What else should this system be able to do?
6) Environmental conditions – Where will this device live?
7) OS (Operating System) – There are three primary OS types, and many alternative OSes.
8) Number of displays / outputs – One of the more important decisions.
9) Storage – type, capacity, speed
10) Comfort – What is the usability of this system?

Budget:
As I said, budget is probably the most important concern when building a new system.  This is ultimately a personal issue, and what follows is just advice for consideration.

Everyone would love a $10,000 system – not everyone needs one, however.  Some people only have $100 to spend at a time, and that's OK too.  Most basic systems can be built for less than $1,000, and these systems can be expanded and upgraded later.  Depending on the upgrade or expansion, some items may need to be completely replaced, which should be considered when budgeting.

If building piecemeal, it is best to start with the components which are somewhat future-proof.  One of those components is the case.  If building a standard PC, cases used on PCs from the '90s are still suitable for builds today, because they are designed around the ATX standards (more on that later).  The power supply (PSU) is also more or less future-proof, provided a reasonable wattage is purchased and the future system will use less than that wattage.  RAM specs change often; however, as long as the RAM purchased is the same generation (DDR3, DDR4, etc.) as what will be used in the future system, it can be used.  RAM speed may be a limitation, but as long as it's the same generation, it'll work until you can afford to upgrade it.  Hard drives, especially SATA drives, still have many years of usability ahead of them; however, M.2 NVMe drives are the future.  These are all things to keep in mind when building a system: the wrong component purchased now can quickly become outmoded and unusable, becoming a waste of money.
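
As an aside on PSU headroom, here's a rough sketch of how you might total up a planned build's draw and pick a wattage with room to grow.  Every wattage figure below is a hypothetical placeholder, not the spec of any particular part, and the 40% headroom factor is just a common rule of thumb – check real spec sheets before buying.

```python
# Rough PSU sizing sketch -- all wattage figures are hypothetical placeholders.

component_watts = {
    "cpu": 125,        # e.g. a mid-range desktop CPU's rated TDP
    "gpu": 220,        # board power of a mid-range graphics card
    "motherboard": 50, # chipset, VRMs, fans, lighting, etc.
    "ram": 10,         # a couple of DIMMs
    "storage": 15,     # one NVMe drive plus one HDD
    "peripherals": 25, # USB devices drawing bus power
}

estimated_load = sum(component_watts.values())

# Leave headroom so the PSU runs in its efficient range and has room
# for future upgrades (a common rule of thumb is 30-50% extra).
headroom = 1.4
recommended_psu = estimated_load * headroom

print(f"Estimated load: {estimated_load} W")
print(f"Suggested PSU rating: ~{round(recommended_psu, -1):.0f} W")
```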

Form-factor:
There are many, many form-factors.  Some are decades-old standards that won't be going away any time soon, some are new standards that may or may not survive, and others are completely proprietary and won't be usable with standard components.

Phones, phablets, tablets, wearables, etc. all use proprietary form-factors.  There have been a few attempts at creating modular standards for phones; however, these have not come to fruition.

Ancient, outmoded standards include the AT form factor, as well as other forms such as IBM’s original PC, Sparc desktops, etc.  Unless you’re an enthusiast, hobbyist, etc – stay away from these.

Some of the newer standards which are available, some being more proprietary than others, are in the matchbook or palm-size computer field.  This includes things like the Raspberry Pi.  Many of these use the pico-ITX standard, either in full or in part, so compatibility isn't guaranteed.

The more standard PC form-factors are based on the original AT standard.  These include ITX (pico-ITX, nano-ITX, mini-ITX), DTX (mini-DTX, DTX), and BTX (pico-BTX, nano-BTX, micro-BTX, BTX).  (Note: the BTX standards have mostly fallen out of favor; additionally, first-generation BTX cases were simply "upside down" ATX cases, while later BTX cases were completely different.)

Finally, we have ATX – the tried and true form-factor upon which most current (and more than likely future) desktops, workstations, workgroup servers, and DIY rack-mount servers are based.  This includes Flex-ATX, micro-ATX (aka μATX, uATX, mATX), ATX, EATX, and many purpose-specific variations.
[note: All of the above standards are loosely listed by physical size]

These standards are the basis of motherboard and PC case dimensions and mounting-hole positioning.  Nothing in these standards dictates the intended purpose of the end product.  It is possible to build a high-availability server using a mini-ITX motherboard and case, or to build a gaming rig with an EATX board and case.  The component quality, component function, and OS generally dictate the use of a system.

It is not only possible, but very common, for a smaller motherboard to be used in a larger case, such as a mini-ITX motherboard mounted in an EATX case.  Conversely, a larger motherboard cannot fit within a smaller case.  Though a case may be the physical size of an EATX case, it may be designed as an ATX case, and as such an EATX motherboard is not usable in it – even if it can physically fit.  Forcing this can create hazards for the components, even shock and fire hazards.
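
To make the size relationships concrete, here is a small sketch using the commonly quoted maximum board dimensions for a few of these standards.  The figures are the nominal published sizes, and the check is deliberately naive: real compatibility is determined by the case's supported form-factor list and standoff layout, exactly as described above.

```python
# Nominal motherboard footprints in millimetres (width x depth).
# Always confirm against the case's supported form-factor list, since a case
# the size of an EATX board is not necessarily an EATX case.

BOARD_SIZES_MM = {
    "mini-ITX": (170, 170),
    "micro-ATX": (244, 244),
    "ATX": (305, 244),
    "EATX": (305, 330),
}

def board_fits(board: str, case_supports: str) -> bool:
    """Naive dimension check: a board 'fits' if it is no larger than the
    biggest form factor the case officially supports. Standoff positions and
    the I/O shield matter too, so treat this as a first filter only."""
    bw, bd = BOARD_SIZES_MM[board]
    cw, cd = BOARD_SIZES_MM[case_supports]
    return bw <= cw and bd <= cd

print(board_fits("mini-ITX", "EATX"))   # True  -- small board, big case
print(board_fits("EATX", "ATX"))        # False -- never force it
```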

Generally, the vast majority of home PCs are based on μATX and ATX.  Name-brand manufacturers are also producing newer "SFF" (small form factor) desktops based on Flex-ATX and mini-ATX.

Some of the major practical differences between these standards are the number of RAM slots and the number of expansion slots (PCI, PCIe, and now M.2) each allows.  In these more common standards, there is space allocated for 1 to 6 RAM slots, though not every slot may be populated on a given board.  As for expansion slots, there can be 1 to 7.  With the advent of M.2 NVMe storage, some PCIe slots are removed to make way for M.2 slots.  More niche standards allow up to 10 PCIe slots and up to 16 (or more) RAM slots; these are out of scope for this discussion, mentioned only for completeness.

The form-factor standards do not necessarily specify other aspects of the case or motherboard.  One of the more important components directly affected by these standards is the power supply unit (PSU), with ATX PSUs being usable in most cases compatible with mATX or larger.  ATX PSUs are the most standard, and as such are much more readily available and usually less expensive per watt.

Architecture:
This is the logical specification the system is based around.  There's some confusion on various media sites online as to what constitutes an architecture.  Here, we're going to focus on the commonly implemented ISAs (instruction set architectures).  There are two common, modern, in-use ISA types: CISC and RISC.  CISC is "Complex Instruction Set Computer," while RISC is "Reduced Instruction Set Computer."  There are several ISAs based on CISC and RISC.  Unless you're doing something very specific with a lot of knowledge, these decisions boil down to x86 or Arm (previously written as ARM; forgive me if I mistakenly use "ARM" in lieu of "Arm").

Arm (Advanced RISC Machine) is an ISA based on RISC, and it is the most common processor family in use on the planet.  Surprised?  Arm CPUs are used in the lion's share of tablets, phones, wearables, and embedded systems (such as cable boxes and DVRs), and, if Apple holds true to its plans, in their line of desktop computers via the A-series processors used in their mobile devices.  Other companies have been producing Arm-based laptops and desktops for several years as well, though these systems have caught on only with hobbyists, in academia, and in other niche markets.  There are many, many implementations of Arm CPUs, and not all of them are compatible with each other due to proprietary extensions to the ISA, which an OS or piece of software may rely on heavily to function.  Very basic software written against the base Arm ISA can generally run on other Arm CPUs; however, the OS still needs to be able to load the software, so incompatibilities may persist.

The most common ISA in use in "standard" computers, from laptops to servers, is the x86 ISA.  This ISA started in the late 70s with the Intel 8086.  Its early successor is the 32-bit i386 (IA-32) set, used in the Intel 80386, Pentium, Pentium II, and Pentium III processors.  To this day, x86 processors are technically still backward compatible all the way to the original 16-bit instructions, though in reality it is quite difficult to bootstrap 16-bit code on modern CPUs.  Currently, the most common implementation of the x86 ISA is AMD64.  This is named after the 64-bit extensions to the x86 specification that AMD pioneered (or at least brought to market before Intel's equivalent).  Today, all 64-bit x86 processors are based on the same AMD64 extensions, including Intel CPUs.  Intel, via their Itanium products, actually had a 64-bit CPU before AMD ever shipped theirs; however, Itanium was not compatible with x86 and uses its own architecture, known as IA-64 (while IA-32, by contrast, refers to 32-bit x86).  Over the years, some confusion has built up around the shorthand "x64": in practice it refers to 64-bit x86, more properly written as x86-64 or AMD64.

The decision here is thus a three-way decision: x86-64, Arm, or going off the deep end with one of the plethora of alternatives.  The most readily available CPUs sold as stand-alone products come from Intel and AMD.  The most common Arm CPUs sold as stand-alone products require a LOT more work to build a system around, and so should be purchased as part of a mainboard.  There are very few ATX-compatible Arm-based motherboards on the market, and those have the Arm CPU included, usually soldered on.  The most common Arm-based systems are phones, tablets, and other mobile and embedded systems, including the Raspberry Pi, and (soon?) Apple Macintosh desktop computers.

The only real option for the everyday builder comes in the form of x86-64.  Thus, the decision becomes a choice between Intel and AMD CPUs.  Anything else requires a large investment of money, time, or knowledge, and probably a fair amount of all three.
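
As a quick illustration, Python's standard library can report the ISA of whatever machine you run it on.  This is just a convenience check, not part of any build process, and the exact strings returned vary by OS.

```python
# Report the ISA of the machine this runs on, using only the standard library.
import platform

arch = platform.machine()
print(arch)  # typically 'x86_64' or 'AMD64' on PCs, 'aarch64'/'arm64' on Arm

if arch.lower() in ("x86_64", "amd64"):
    print("This is an x86-64 (AMD64) system.")
elif arch.lower() in ("aarch64", "arm64") or arch.lower().startswith("arm"):
    print("This is an Arm system.")
```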

Primary Use:
If you're building a computer to run Windows or standard distributions of GNU/Linux, you're probably going to want to go with x86-64.  However, if you want to build something more purpose-specific, such as a vehicle PC, a Raspberry Pi or similar may be more suitable.  The primary use carries a lot of weight in deciding on architecture, CPU, form-factor, etc.

In an "expense is no issue" world, throwing all your money at this would give you a system you can use for almost anything.  Almost anything.  Specific directions taken can eliminate other potential, specific uses.  That's not likely to be an issue, though.

There are several common primary uses for PCs.  These include gaming, development, design, art and rendering, heavy office work (such as massive spreadsheets or stock trading), light office duty with single or limited application use (such as scheduling software), light home use, and then more general-purpose / heavy home use.  We can boil these down to "low end," "high end," "gaming," "enthusiast," and "scientific/academic," with the variables being "low power," "high reliability," and "specialized."

Low-power PCs are suitable for light office work, monitoring systems, entry-level home use, or anywhere low (or no) noise and low heat production matter more than high performance.

High-reliability PCs are intended to keep running and be available for use at any time.  This may be a requirement for industrial, manufacturing, or school work, etc.  Often this term is applied to servers, though standard PCs generally have a reasonably high level of availability as well.  It may also be applied to desktop PCs which are never powered off.

Specialized PCs are those which have an unbalanced focus on one or two specific aspects.  File servers may have a very low-end CPU, a bare minimum of RAM, and no video, but tens or hundreds of TBs of storage.  Digital art computers may have lots of RAM, very fast GPUs designed for art rather than gaming, a moderately fast CPU, and just enough storage for the finished art.  Gaming computers, on the other hand, will often focus on faster CPUs with more cores, coupled with high-end gaming graphics cards and very fast storage I/O.

It may seem odd to mix these variables with certain use case types.  However, a secretary who only answers the phone to schedule appointments or relay communications may require a low-end highly reliable PC.  This PC may have a low-end CPU, on-board graphics (which honestly are getting better by the year), limited storage space on slower drives, and bare minimum RAM – but all of the components are higher quality to ensure that no hardware issues prevent the secretary from working.

Low-end PCs are usually the least expensive options and build-outs.  They will handle light-duty work, from Grandma checking for her grandson's emails to Uncle Bill updating client websites.  These systems are the first to be outmoded by technological advances, becoming useless with advances in web tech or unable to run new OSes.  Generally, expect them to have a 2-3 year life span, unless they are specialized systems where power and performance will never be an issue.  These may have an Intel Core i3 or AMD Ryzen 3 CPU.

High-end PCs are generally more common, and more general purpose.  These are PCs designed for light gaming, heavy web use, developers who compile code on their PC, heavier office work,  etc.  These are often the $800-$1600 PCs sold by name brands, and include CPUs in the i5/Ryzen 5 series.

Gaming PCs often pull from high-end PCs and from enthusiast PCs.  A good gaming PC would qualify as a high-end PC, with the addition of a high-quality GPU and specific changes to give the user better performance in AAA-title and graphically demanding games.  Often these PCs will also be used for digital art, audio/video rendering and streaming, CAD/CAM production, stock trading (mostly due to the ability to drive multiple high-resolution monitors), and other situations where high-bandwidth, low-latency, massive work is done.  These are often built with i7/Ryzen 7 or better CPUs.

Enthusiast PCs take the gaming PC to the next level, often being used for gaming where a very high resolution with a very high FPS is desired, regardless of the complexity of the game.  These systems are often used for render farms, crypto-currency mining, high-level development, virtualization environments, home servers, and "just because."  These will have at minimum i7/Ryzen 7 CPUs, and often are based on higher-end CPUs, such as the i9/Ryzen 9 series, Ryzen Threadripper, Xeon workstation CPUs, or sometimes AMD EPYC and Intel Xeon server CPUs.  They are often overkill for daily use, or where financial gain is not involved.  These are often designed as higher-end gaming PCs with emphasis on some other aspect, relying on higher-tier components to ensure high reliability and availability.

All of these can be scaled to various levels, where the amount of RAM, Storage, CPUs/cores, GPUs, etc are balanced in a way to best suit the needs of use.

Secondary Use:
Not all computers are multi-use; some have a single use and will never see any other use until they're tossed in the scrap pile.  However, the great thing about PCs is that they are by nature much more general-use than purpose-built computers, as seen in industrial controls, embedded systems, or even appliances and automobiles.

The secondary use may not even be how the system is usually used.  However, if the computer will be used for gaming some of the time and writing web pages the rest of the time, then its primary use should be gaming.  It is much easier to under-utilize a computer than to use it for heavier loads than it was designed for.

There's very little difference between primary and secondary uses.  Let's say your PC is used for light home use most of the time and occasionally for gaming, and that heat and noise levels are important but availability is not.  Finding a solution that balances all of these is important, and not really difficult.  There are technologies built into modern PC components to put the CPU, GPU, and overall system into lower-power modes.  This allows a decrease in noise and heat production when the PC is lightly used, while still being able to ramp up for gaming.  Conversely, a gaming PC that is occasionally used for web browsing and is off the rest of the time may be better built using higher-tier components, so more enjoyment and less hassle is encountered while gaming, at a potential cost of additional heat and noise production.

Environmental Conditions:
This is one of the most important things to consider when building or buying a PC.  These conditions include air quality, ambient air temperature, humidity, dew point, air flow, aesthetics, accessibility, safety, mobility, shelter, even power conditions and network access, and changes in any of these conditions.

Ideal conditions are:
Air quality: Computers should always be in the cleanest air possible.  There are specialized computers built in IP66-rated cases for highly contaminated air, but they require external cooling.  The dirtier the air, the more often the system will need to be powered down and cleaned.
Temperature: 40-70 degrees Fahrenheit (roughly 4-21 °C) ambient air, with 80+ °F (27 °C) being abusive.
Humidity: Under 60%; above 60% the system is in danger of various issues, including excessive dust collection.  The lower, the better.
Dew Point: This is a continually changing variable; you never want a PC to be powered on in an environment where dew can form in, or even on, it (see the sketch after this list for estimating it).
Air Flow: This is a two-sided condition.  The first side is the amount of fresh air available to the computer, and the second is the amount of air flowing through the computer.  If either of these is hindered, the other will be adversely affected.  A computer with no through-case air flow, even in a 40-degree room, will be unable to move heat away from the components properly.  A computer with lots of through-case air flow, but which sits in a small or sealed space, may be recycling the same hot air through the case – causing heat dissipation from the components to be sub-optimal, or even non-existent.
Aesthetics: This is the most trivial of conditions, but it's important to consider aesthetics when they're not trivial.  A cheap, ugly case may be inappropriate for a public area in a business that cares about appearance.  Cases come in almost any color as well, and a black case may be an eyesore in a brightly colored and themed space.
Accessibility: No matter how awesome a computer is, there will be times when cables need to be plugged in or removed, the power and reset buttons need to be pressed, and of course the case needs to be opened for cleaning.  Bolting a computer 9 feet up on a wall may look cool, and may even be great for cooling, but can it be reached easily?  Conversely, security may be a concern as well: should it be accessible by just anyone?
Safety: Lots of people love to put their computers under their desk – where they are vulnerable to being kicked, having drinks spilled on them, and pets and children messing with them, which is a potential hazard to both the pet/child and the PC.  This is an electrical device, and all the hazards therein apply.  That's not just electrocution hazards, but also flammable atmosphere concerns – PCs should not be used near flammable chemicals (such as next to a gas can in a mechanic's shop).  Cases can have very sharp edges, and even those plastic fans can mangle a child's finger.
Mobility: Will your computer be moved often?  Would a high-end laptop be a better choice?  In a time long ago, before the internet was fast enough to support large multiplayer games, there was a social gathering called a LAN party.  Gamers would build computers that could be set up and disconnected quickly.  Something else to consider here is wired vs. wifi networking.
Shelter: The vast majority of PCs are installed and used indoors.  However, there are occasions when a PC is to be used outdoors.  Along with all of the previous environmental conditions, steps should be taken to ensure that the PC will not be rained on (or sprayed with a hose), is not left in direct sunlight where the sun beats down with lots of heat, etc.  The point is that a PC can be installed in a non-indoor, non-conditioned space, but precautions must be taken to protect it from the elements.
Power and Network: In some states (LOOKING AT YOU, FLORIDA!) one of the worst dangers to a PC is the electrical power it's plugged into.  From over-voltage spikes to under-voltage dips and brownouts, these power conditions can put undue additional strain on the PSU, and thus the whole computer.  To combat these issues, an on-line battery backup unit (Uninterruptible Power Supply – UPS) should be used.  The UPS should also be plugged into a good-quality surge protection unit, specifically to protect the UPS itself from power conditions.  In the old days, dial-up modem expansion cards would often take a surge through the phone line, and if the modem itself was not destroyed, it would pass the surge through the PCI (or ISA) slot into the motherboard.  Now, with Ethernet being much more prevalent in homes and offices, and often with hundreds of feet or even miles of connected wiring, surge dangers over the network are also much more prevalent.  This can be remedied either by using wifi or by utilizing surge protection on the network.  Note that wifi is much more vulnerable to other issues, including RF interference, which can cause connectivity problems.
Changes in these conditions: It's possible that the perfect setup is achieved, and then one day things need to change.  How capable is this PC of surviving those changes?  Honestly, this isn't something one can properly plan for in all situations, but it is something to keep in mind nonetheless.
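
Since dew point is the one condition on this list you can actually calculate, here's a small sketch using the Magnus approximation to estimate it from temperature and relative humidity.  The coefficients are the commonly used Magnus constants; treat the result as a rough guide only, not a precision measurement.

```python
# Estimate the dew point with the Magnus approximation, to sanity-check
# whether a space risks condensation on a cold chassis.
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    a, b = 17.62, 243.12  # commonly used Magnus coefficients
    gamma = math.log(rel_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Example: a 21 C room at 55% relative humidity
print(f"Dew point: {dew_point_c(21.0, 55.0):.1f} C")
# If any surface in or on the PC can reach that temperature (a cold garage,
# an outdoor enclosure), condensation becomes a real risk.
```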

Operating System (OS):
There are many, many OSes which have been produced, marketed, and made available over the years.  Many of these are still in use and kept viable with updates and upgrades.  Not all of these OSes are compatible with each other, or have support for general use.  There are three types of OS: server OS, desktop OS, and niche OS.  There are some blurred lines between these, but we'll keep away from those lines here.

First, it must be understood that an OS is simply software which is loaded onto a computer to provide some form of user interface to run and load other software.  With the advent of virtualization, OSes can be installed inside other OSes (via hypervisors) or other software.  We're going to focus on OSes installed on hardware, however.

Server OSes are installed on hardware (traditionally on highly available, highly reliable, high-quality computers).  These OSes are intended for continual operation and multiple-user access – either directly or through software services such as a web server or multiplayer game – or they host higher-level services for other systems to access, such as SQL databases, file servers, etc.  Common, modern server OSes boil down to four categories: UNIX, Linux, Windows Server, and macOS Server (which is based in part on UNIX, but with many, many changes).  These OSes have hardware and ISA requirements, and as such, not every server OS is capable of running on any given server host.

Niche OSes can be further organized into sub-categories for scientific, industrial, embedded, academic, and proof-of-concept systems.  Some prominent niche OSes are MINIX, BlackBerry OS, Symbian, Windows Phone, ReactOS, and MenuetOS.  Android and Apple's iOS also fit here, as their intended and practical use is in embedded systems and phones.

Desktop OSes: This is more likely the route you will take if you're reading this blog post for your first bits of information.  These are generally available as commercial products, but there are many freely available and even open-source OSes as well.  Some of these OSes work only on specific computers, and some have ports available for many computers and ISAs.  The most common OSes are Microsoft's Windows, Apple's macOS (with all of its name changes over the years), GNU/Linux (with all of its variant distros and specialties), and Chrome OS.  Many distributions of GNU/Linux have ports available for Arm and x86-64.  Windows has been ported to various systems in the past, but its primary architecture is x86, with heavy emphasis on x86-64 now.  Aside from some legal hindrances, Apple's macOS is available only for Apple-built computers, even though these computers are merely lightly modified x86-64 PCs (so much so that Windows and GNU/Linux can be installed on modern Macs).  Chrome OS is also generally hardware-locked to Chromebooks produced by Google and partners.  There are also efforts to port various open-source OSes, such as "Darwin" (the base system of Apple's macOS) and Android, to the generic x86 platform, with mixed results.  The difficulty level increases exponentially with these alternatives.

Assuming you're building an x86-64 PC and you don't want to violate licensing and terms & conditions, you're left with two choices: GNU/Linux and Microsoft Windows.  There are many alternatives, with varying results and difficulties of installation, but again, since you're reading this for information, we'll go with the two simpler options.
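
As a small aside, if you ever need to confirm what a machine is already running, Python's standard platform module can tell you.  The distro lookup below assumes Python 3.10+ on a Linux system with an /etc/os-release file; everything else is plain standard library.

```python
# Check which OS (and, on Linux, which distro) a machine is running.
import platform

print(platform.system())    # 'Windows', 'Linux', or 'Darwin' (macOS)
print(platform.release())   # kernel / OS release string

if platform.system() == "Linux":
    try:
        info = platform.freedesktop_os_release()  # Python 3.10+ only
        print(info.get("PRETTY_NAME", "unknown distro"))  # e.g. an Ubuntu LTS string
    except (AttributeError, OSError):
        # Older Python, or no /etc/os-release available
        pass
```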

Currently, there's (practically) only one supported Microsoft OS – Windows 10.  Windows 8 is nearing end of life and will need to be upgraded to Windows 10, so it should not be seen as a viable route.  Windows 10 has several variants as well: Home and Pro, plus several editions for Enterprise, Education, and others.  The decision here is between Home and Pro.  For the average user, there's little in the way of relevant differences, the most important being data-security functions which are included with Pro but not Home.  Additional higher-level add-on components may also not be available for the Home edition.  There are many pages dedicated to explaining the differences, so I'll leave that to those pages and sites.  From here on, we'll refer to Windows 10, and not specific editions.

As for GNU/Linux – there's a handful of trusted and easy distros.  A distro is loosely similar to one of Microsoft's editions; however, whereas Microsoft produces every edition of Windows 10 itself, distros are produced by many different companies, groups, and organizations.  Some GNU/Linux distros are not freely available and are offered as supported commercial products, though those companies usually have freely available distros as well.  Some of the top GNU/Linux distros are Linux Mint, Arch Linux, openSUSE, Debian, Fedora (from Red Hat), Elementary OS, and my personal favorite, Ubuntu – specifically Xubuntu, a variant of Ubuntu which comes with the XFCE desktop environment.  There are way too many companies, distros, variants, desktop environments, and software packages to cover them all here.  As there are freely available distros with all kinds of combinations of these, only personal experience can decide the best distro for each person.

For a general-use, friendly, and easy-to-use GNU/Linux distro, I highly suggest starting with Ubuntu.  It has a well-polished interface, high software compatibility, and is a stable and proven system.

One very important aspect of the OS to consider is the availability of software.  The vast majority of commercial productivity and entertainment software is written for Microsoft Windows.  This includes almost every AAA-title game, with many having GNU/Linux ports of less-than-great quality.  For the last few decades, the "good" games have been written to use DirectX, a platform available only on Microsoft Windows and Xbox consoles.  If you're intending to build a gaming system, you're pretty much left with choosing Windows 10.  There are some open-source projects to implement DirectX compatibility on GNU/Linux, again with less-than-great success.  If you never intend to run many high-end games, GNU/Linux is a less expensive choice.  Check whether the games and other software you care about have GNU/Linux ports.

Displays and outputs:
It used to be that PCs could have a single monitor, a printer, and a speaker – and that was the extent of any output from the computer.  Now it's common for PCs to have two or more monitors, network-connected printers, full surround speakers, VR headsets, LED displays (and RGB lights), projectors, connectivity to wireless devices and headsets, and many, many more human interface devices.  There are also many more input devices now than there used to be.  The mouse, for example, did not always exist.  Game controllers, 4K cameras, high-resolution microphones, video capture cards, even TV tuners were a big thing for a while.  We even have network-attached storage systems now.

We used to have a multitude of connectors on PCs, too: RS-232 serial, parallel/printer ports, video out (sometimes not even VGA), PS/2 keyboard, PS/2 mouse, 3.5mm stereo out and mic in, and often dedicated game controller ports.  Connectivity was by external modem, and then by internal modems, token ring cards, and eventually Ethernet ports built into the motherboard.  There were expansion cards to add more ports, some being proprietary.  At one time, hard drives and CD drives required expansion cards.  We are moving closer to everything being connected via USB, with technology steaming ahead toward USB-C ports carrying USB 3.

It is important, at least for the near future, to ensure your new computer has the ports and functions you'll need, in the quantity you'll need.  USB 3 ports are becoming more and more important, in both USB-A and USB-C (on the PC side).  Many devices, however, are still only USB 1.1 capable, and have the potential to negatively impact USB 3 devices.  Keyboards, mice, and even headphones don't require the extra bandwidth provided by USB 3.  The other important port to make sure you have is Ethernet.  Currently 1000 Mb (1 Gb) Ethernet is the most common, though 2.5, 5, 10, and even 100 gigabit Ethernet are available (with increasing costs).  The future may determine that 2.5 or even 10 Gb becomes the mainstay for high-end networks.  Please note that just because the PC is connected to the network with 10 Gb Ethernet does not mean you will have 10 Gb internet; it just means computers on the same local area network (LAN) can communicate with each other at the faster speed, provided all the devices between the two computers are capable of it.
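
To put those link speeds in perspective, here's a quick back-of-the-envelope calculation of how long a large LAN transfer takes at different Ethernet speeds.  It ignores protocol overhead, so real-world numbers will be somewhat slower, and the 100 GB file size is just an example.

```python
# Rough transfer times on a local network, ignoring protocol overhead.
# Link speeds are gigabits per second; file size is gigabytes.

def transfer_seconds(file_gb: float, link_gbps: float) -> float:
    file_gigabits = file_gb * 8  # 1 byte = 8 bits
    return file_gigabits / link_gbps

file_size_gb = 100  # e.g. a big game install or a backup image

for speed in (1, 2.5, 10):
    t = transfer_seconds(file_size_gb, speed)
    print(f"{file_size_gb} GB over {speed} Gb/s Ethernet: ~{t / 60:.1f} minutes")

# Remember: this is LAN speed between two capable machines; your internet
# plan is a separate (usually much lower) limit.
```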

I highly suggest finding a motherboard or pre-built PC with 2 USB 1.1/2.0 ports, 4+ USB-A 3.x ports, and at least one USB-C 3.2 port.  A "six-pack" of 3.5mm audio ports is optional, for multiple speakers and audio inputs.  There is limited common use for serial ports, parallel/printer ports, FireWire, PS/2, or game ports.  Even modern external storage devices – thumb drives, SSDs, and hard drives – are moving to USB 3.x.  The only exception here is if you have a nice keyboard, or are intending to be a "pro gamer," and want to use a PS/2 keyboard.  PS/2 keyboards are segregated from other devices, and as such have lower latency and less chance of interference from other devices or system issues.

With a motherboard with plenty of USB 2 and 3 ports, there's limited use for expansion cards in most builds.  The last few expansion cards most people will ever use are graphics cards, high-end audio cards, and video capture cards.  Other cards for enthusiasts and business use include RAID cards, additional network cards, and high-end storage cards.  Beyond those, there are few uses for expansion cards outside of niche cases.  Someone building a two-monitor gaming rig might only need one PCIe x16 slot for a high-end GPU.  Others might need to install 3 or 4 x16 cards, but some of those cards will end up in x8, x4, or even x1 slots, limiting throughput and performance.  Almost always, the x16 slot nearest the CPU should be used for the (primary) graphics card.  Check the manual for any specific motherboard to see what the physical and logical form of each expansion slot is.  Many times, the x16 slots furthest from the CPU are logically only x8 or x4, and in some cases x1; this allows those slots to be used with some x16 cards which are capable of operating in slower modes.  Generally speaking, most people will never need to worry about this – the biggest concern is whether the motherboard has one or two logical x16 slots, for two graphics cards.
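
For a rough sense of why a slot's wiring matters, here's a small sketch using commonly quoted approximate per-lane PCIe throughput figures (after encoding overhead) to compare x16, x8, x4, and x1 links.  The numbers are ballpark values, not guarantees for any specific board.

```python
# Approximate usable PCIe bandwidth per slot, by generation and lane count.

PER_LANE_GBPS = {  # gigabytes per second, per lane (approximate)
    "PCIe 3.0": 0.985,
    "PCIe 4.0": 1.969,
}

def slot_bandwidth(gen: str, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

for lanes in (16, 8, 4, 1):
    bw = slot_bandwidth("PCIe 4.0", lanes)
    print(f"PCIe 4.0 x{lanes}: ~{bw:.1f} GB/s")

# A modern GPU in an x4-wired slot still works, but the link, not the card,
# may become the bottleneck.
```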

Storage:
In the old days, computers had to be programmed every time they were turned on.  Let’s skip a whole bunch of generations of storage technology to the Floppy diskette and IDE drive era.  Next were CD and DVD drives, at first using expansion cards, then using IDE/ATA.  Then SATA came along, and hard drives and disc drives used SATA, with floppy drives falling out of favor.  SATA has been around a long time now, but is being replaced again.  There’s several new technologies now, but the dominating form-factors are M.2 and U.2, with SATA-III remaining relevant.  Things get more complex with M.2 being either SATA or PCIe.  These two standards are not interchangeable. M.2 M-key connectors are used for M.2 NVMe SSDs;  M.2 B-key connectors are used for M.2 SATA SSDs, as well as some other small expansion cards, such as wifi adapters.  U.2 drives are physically similar to standard 2.5″ SATA drives, with a higher bandwidth connector based on PCI Express and SATA.

Until technology advances further, M.2 NVMe and SATA-III (SSD or HDD) drives should be the target products for purchase.  NVMe drives are a little more costly per GB than SATA-III SSDs or HDDs, but they use the fastest method currently available to read and write data.  SATA-III HDDs are by far the least expensive per GB, but are also the least performant.  The cost of SATA-III SSDs has come down in recent years.  It's not uncommon to see modern PCs with a mix of two or all three of these drive technologies.  Often, a smaller-capacity NVMe is used for the OS boot drive, with a SATA-III SSD used for applications and commonly used data, and HDDs used for large quantities of storage.  It is possible to use a single NVMe, SSD, or HDD drive for all data.  SATA-III SSDs are still the best balance of cost, capacity, and performance, and they remain the most compatible, as not all motherboards ship with M.2 or U.2 connectors – and if they do, the M.2 slot may be a B-key, SATA type (which is not the same as standard SATA-III, and not directly physically compatible).
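
Cost per GB is the easiest way to compare these options; here's a tiny sketch with hypothetical prices plugged in.  The capacities and prices below are placeholders only, so substitute whatever the market looks like when you shop.

```python
# Hypothetical price points to illustrate cost per GB across drive types.

drives = [
    # (description, capacity in GB, price in USD) -- placeholder figures
    ("NVMe SSD, 1 TB", 1000, 100),
    ("SATA-III SSD, 1 TB", 1000, 70),
    ("SATA-III HDD, 4 TB", 4000, 80),
]

for name, capacity_gb, price in drives:
    print(f"{name}: ${price / capacity_gb:.3f} per GB")
```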

A good mixed-device setup might have a 120GB NVMe, a 1TB SSD, and one or more 3TB+ HDDs for storage and on-system backups.  A USB-C/3.2 external hard drive is a great solution for near-line backups.  Off-site backups of important data should be performed as well, either to a NAS at another location or to an online storage provider such as Google Drive or OneDrive.
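
For the on-system or near-line part of that strategy, even a very simple timestamped copy to an external drive is better than nothing.  This is only an illustrative sketch with made-up paths; a real backup routine should be automated, versioned, and verified.

```python
# Minimal "near-line" backup sketch: copy a folder to an attached external
# drive under a timestamped directory name. Paths are hypothetical examples.
import shutil
from datetime import datetime
from pathlib import Path

source = Path.home() / "Documents"          # what to back up
backup_root = Path("/mnt/external_backup")  # e.g. a USB 3.x drive
                                            # (on Windows, something like Path("E:/backups"))

stamp = datetime.now().strftime("%Y-%m-%d_%H%M")
destination = backup_root / f"documents_{stamp}"

shutil.copytree(source, destination)
print(f"Backed up {source} to {destination}")
```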

Most people don't require more than 120GB to 1TB for their computers, and a single NVMe can handle all of their storage needs.  Gamers, on the other hand, may have 4-5 TB of game installs – or, if you're like someone I know, almost 8TB!  These are all things to consider when designing or picking your new PC.  It's far easier to have extra space than to have to upgrade storage; it's also more costly.  That's a balance each person needs to figure out for their own use and needs.  External USB drives are always an option, but can be cumbersome.
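
Before settling on capacity, it's worth checking how much you actually use today.  A few lines with Python's standard shutil module will tell you; the "/" path is an example, so point it at your own drive.

```python
# Check current drive usage before deciding how much capacity the new build needs.
import shutil

# Use "C:\\" on Windows; "/" works on Linux and macOS.
total, used, free = shutil.disk_usage("/")

gb = 1024 ** 3
print(f"Total: {total / gb:.0f} GiB")
print(f"Used:  {used / gb:.0f} GiB")
print(f"Free:  {free / gb:.0f} GiB")
```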

The main aspects of storage are capacity, I/O speed, cost, physical size, and compatibility.  An ultra-fast 120GB NVMe may cost more than a 2TB SATA-III SSD, but both may still be more expensive than a 12TB 3.5″ SATA-III HDD.  Gamers, designers, artists, and similar users may opt for SSDs, so they have a good balance of read/write speed and capacity.
A secretary answering phones may only need the performance of an SSD, but an NVMe would ensure the system doesn't stutter on I/O, and the much smaller size means her workstation can live in a smaller case.

Comfort:
This section is intended more to deal with various aspects outside the PC build itself.  However, some things to consider with the PC case and overall build are lights, noise, and heat production.  RGB lights have been "all the rage" for a while now.  They're pretty, but they can be distracting and annoying.  They can even cause sleep issues if the PC is in the bedroom.  The same goes for noise.  Heat, on the other hand, should never be an issue provided environmental conditions are maintained properly.

Other aspects of comfort in regards to a PC are the desk height for the keyboard and mouse, the distance between eyes and monitors, the height of monitors relative to the user's eyes, and the number of displays – constantly turning one's head to view extremely wide screen areas will cause fatigue.  For a while, glossy bezels on monitors and TVs were in style; however, it has been proven time and again that low-shine bezels are much less distracting and much easier on the eyes.

It is important that any computer chair be tested by the user.  If the chair will be used for more than an hour at a time, it should be very comfortable.  There are three types of chairs best suited for computers – and they're all essentially the same thing: an office chair.  Computer chairs, gaming chairs, and office chairs are all very similar, with the differences being mostly in cost.  "Computer chair" is a marketing term used for chairs more suitable for use at a computer... somehow different from an office/desk chair.  Gaming chairs' fame comes from their more interesting designs and colors.  Standard office/desk chairs are often much more suitable for long sessions at the PC; these chairs are designed for 8-hour shifts.  Whenever possible, a chair rated for 300 lbs is better, even if you only weigh 120 lbs.  They are often much better built, with better-quality padding and surface materials, and will last a very long time for anyone under their weight limit.  For those who are heavier (read: heavier, not fatter – Jason Momoa, for example, is heavier but not fat), getting a chair rated above their weight is critical for longevity and personal safety, as well as prolonged comfort.

Every other component which you will directly interact with – keyboard, mouse, headset, VR headset, whatever you will physically be touching or wearing – should be tested for comfort, and any annoyances corrected.

The experience can be completely ruined, even with a $10,000 PC, if the user is not comfortable.
