Computer Architecture
Computer architecture is a specification detailing how a set of software and hardware technology standards interact to form a computer system or platform. In short, computer architecture refers to how a computer system is designed and what technologies it is compatible with.
A well-known example of computer architecture is the von Neumann architecture, which is still used by most types of computers today. It was proposed by the mathematician John von Neumann in 1945. It describes the design of an electronic computer with a CPU (comprising the arithmetic logic unit, control unit, and registers), memory for data and instructions, an input/output interface, and external storage.
Computer architecture comprises rules, methods, and procedures that describe the execution and functionality of the entire computer system. In general terms, computer architecture refers to how a computer system is designed using compatible technologies.
Computer architecture is the organization of the components that make up a computer system and the semantics of the operations that guide its function. It defines the machine interface that programming languages and their compilers target.
Computer architecture is a specification describing how hardware and software technologies interact to create a computer platform or system. When we think of the word architecture, we think of building a house or a building. Keeping that same principle in mind, computer architecture involves building a computer and all that goes into a computer system.
THE THREE CATEGORIES OF COMPUTER ARCHITECTURE
There are three main categories of computer architecture, and all of them work together to make a machine function:
- System design
- Instruction Set Architecture (ISA)
- Microarchitecture
System design
- System design covers all the hardware components of the system: the CPU, data processors other than the CPU such as the graphics processing unit (GPU) and direct memory access, memory controllers, and data paths, along with features such as multiprocessing and virtualization. This part is the actual physical computer system.
Instruction set architecture
- The instruction set architecture (ISA) defines the CPU's functions and capabilities: its word size, processor register types, memory addressing modes, data formats, and the instruction set that programmers and compilers use.
- The ISA is the interface between the hardware and the software that makes the computer useful: operating systems such as Windows on a PC or iOS on an Apple iPhone, and applications such as Photoshop, are ultimately translated into the instructions defined here (a small sketch of these ideas follows below).
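To make these ISA-level terms concrete, here is a minimal sketch in Python of a hypothetical 16-bit toy ISA. The word size, register count, field widths, and opcode numbers are all illustrative assumptions rather than any real processor's encoding; the point is simply that an ISA fixes how instructions are represented as bits.

```python
# A hypothetical 16-bit toy ISA: 4-bit opcode, two 4-bit register fields,
# and a 4-bit immediate. All field widths and opcode numbers are illustrative
# assumptions, not any real processor's encoding.

WORD_SIZE = 16          # word size defined by the ISA
NUM_REGISTERS = 16      # R0..R15

OPCODES = {"LOAD": 0x1, "STORE": 0x2, "ADD": 0x3, "HALT": 0xF}

def encode(op, rd=0, rs=0, imm=0):
    """Pack one instruction into a 16-bit word: [opcode | rd | rs | imm]."""
    return (OPCODES[op] << 12) | (rd << 8) | (rs << 4) | (imm & 0xF)

def decode(word):
    """Unpack a 16-bit word back into its named fields."""
    names = {v: k for k, v in OPCODES.items()}
    return (names[(word >> 12) & 0xF],   # opcode
            (word >> 8) & 0xF,           # destination register
            (word >> 4) & 0xF,           # source register
            word & 0xF)                  # immediate value

word = encode("ADD", rd=1, rs=2)
print(hex(word))      # 0x3120
print(decode(word))   # ('ADD', 1, 2, 0)
```

A compiler targeting this toy ISA would only need to know the table and field layout above; how quickly the hardware carries out each instruction is a microarchitecture question, discussed next.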
Microarchitecture
- Microarchitecture, also known as computer organization, defines the data processing and storage elements (the data paths) and how they implement the instruction set architecture (ISA). These elements range from on-chip registers and caches to storage devices such as DVD drives. In short, it is the hardware implementation of an ISA in a particular processor; a sketch of the idea follows below.
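As a rough illustration of how one ISA can be implemented by different microarchitectures, the hypothetical sketch below shows two ways a processor might carry out a multiply instruction: a dedicated single-step multiplier versus repeated addition. Software written against the ISA cannot tell them apart; only cost and speed differ. Both implementations are made-up examples, not descriptions of any real chip.

```python
# Two hypothetical "microarchitectures" for the same ISA-level multiply
# instruction. The architectural result is identical; only the internal
# implementation (and its cost in steps) differs.

def multiply_single_cycle(a: int, b: int) -> int:
    """Implementation 1: a dedicated hardware multiplier finishes in one step."""
    return a * b

def multiply_iterative(a: int, b: int) -> int:
    """Implementation 2: repeated addition -- simpler hardware, more steps."""
    result = 0
    for _ in range(abs(b)):
        result += a
    return -result if b < 0 else result

# Programs see the same behavior from either implementation:
for impl in (multiply_single_cycle, multiply_iterative):
    assert impl(6, 7) == 42
    assert impl(6, -7) == -42
```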
All these parts must fit together and be developed in a coordinated way so that the system functions correctly.
Concept/Detail of Computer Architecture
In computer engineering, computer architecture is a set of rules and methods that describe the functionality, organization, and implementation of computer systems. The architecture of a system refers to its structure in terms of separately specified components of that system and their interrelationships.
Some definitions of architecture define it as describing the capabilities and programming model of a computer but not a particular implementation. In other definitions, computer architecture involves instruction set architecture design, microarchitecture design, logic design, and implementation.
Computer architecture is the structural and behavioral design of a computer system, created by integrating hardware components. It helps us understand the functionality of a system and deals with high-level design issues.
History of Computer Architecture
The term architecture in computer literature can be traced to the work of Lyle R. Johnson and Frederick P. Brooks Jr., members of the Machine Organization department at IBM, in 1959. Johnson needed a way to describe formats, instruction types, hardware parameters, and speed enhancements; these were at the level of system architecture, a term he found more useful than machine organization. The term has since come to be used in many less precise ways.
Earlier, computer architects designed architectures on paper, which were then built directly into final hardware. Later, architecture prototypes were built physically as transistor-transistor logic (TTL) computers. Since the 1990s, new computer architectures are typically built, tested, and tweaked inside another computer architecture, in an architecture simulator, or inside an FPGA as a soft microprocessor, before being committed to the final hardware form.
Types of Computer Architecture
Here are the main types of architecture found in computer systems.
Von-Neumann Architecture
Harvard Architecture
Instruction Set Architecture
Micro-architecture
System Design
Von-Neumann Architecture
John von Neumann proposed and developed this architecture, which is also known as the Princeton architecture. The computers we use nowadays are based on it. It describes a design for electronic digital systems with the following components:
- A central processing unit (CPU) with an arithmetic and logic unit (ALU) and processor registers.
- A memory that can store data and instructions.
- External mass storage or secondary storage.
- A control unit (CU) that holds the current instruction in an instruction register (IR) and tracks the next one with a program counter (PC).
- Input and output mechanisms and peripherals.
The von Neumann design thus constitutes the foundation of modern computing. A similar model, the Harvard architecture, had dedicated address and data buses for both reading from and writing to memory. The von Neumann architecture won out because it was easier to implement in real hardware.
John von Neumann first described this architecture in a 1945 report. A control unit, arithmetic and logic unit (ALU), memory unit, registers, and inputs/outputs are all part of his design. The von Neumann architecture is built on the idea of a stored-program machine, which keeps program instructions and data in the same memory. Most computers still use this design.
THE VON NEUMANN ARCHITECTURE
Mathematician John von Neumann and his colleagues proposed the von Neumann architecture in 1945, which stated that a computer consists of: a processor with an arithmetic and logic unit (ALU) and a control unit; a memory unit that can communicate directly with the processor using connections called buses; connections for input/output devices; and a secondary storage for saving and backing up data.
The central computation concept of this architecture is that instructions and data are both loaded into the same memory unit, which is the main memory of the computer and consists of a set of addressable locations. The processor can then access the instructions and data required for the execution of a computer program using dedicated connections called buses – an address bus which is used to identify the addressed location and a data bus which is used to transfer the contents to and from a location.
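The stored-program idea can be sketched in a few lines of Python: a single flat memory holds both the program and its data, and every access, whether an instruction fetch or an operand read/write, goes through the same pair of "bus" helpers. The instruction format, opcodes, and memory layout below are illustrative assumptions, not a real machine.

```python
# A toy von Neumann machine: instructions and data share one memory, and
# every access goes through the same read/write "bus" helpers.

memory = [0] * 16
memory[0:4] = [("LOAD", 10), ("ADD", 11), ("STORE", 12), ("HALT", 0)]  # program
memory[10], memory[11] = 3, 4                                          # data

def bus_read(address):            # the address bus selects a location,
    return memory[address]        # the data bus carries its contents

def bus_write(address, value):
    memory[address] = value

acc, pc = 0, 0                    # accumulator and program counter
while True:
    op, operand = bus_read(pc)    # an instruction fetch uses the same bus...
    pc += 1
    if op == "LOAD":
        acc = bus_read(operand)   # ...as every data access
    elif op == "ADD":
        acc += bus_read(operand)
    elif op == "STORE":
        bus_write(operand, acc)
    elif op == "HALT":
        break

print(memory[12])                 # 7
```

Because the instruction fetch and the data access share one path, they cannot happen at the same time; this is exactly the bottleneck that the Harvard architecture, described below, tries to avoid.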
THE PROS AND CONS OF THE VON NEUMANN ARCHITECTURE
Computers as physical objects have changed dramatically in the 76 years since the von Neumann architecture was proposed. Supercomputers in the 1940s took up a whole room but had very basic functionality, compared to a modern smartwatch which is small in size but has dramatically higher performance. However, at their core, computers have changed very little and almost all of those created between then and now have been run on virtually the same von Neumann architecture.
There are a number of reasons why von Neumann's architecture has proven so successful. It is relatively easy to implement in hardware, and von Neumann machines are deterministic and introspectable. They can be described mathematically, and every step of their computing process is understood. You can also rely on them to always generate the same output for a given set of inputs.
The biggest challenge with von Neumann machines is that they can be difficult to code. This has led to the growth of computer programming, which takes real-world problems and explains them to von Neumann machines.
When a software program is written, an algorithm is reduced to the formal instructions that a von Neumann machine can follow. The challenge is that not all algorithms and problems are easy to reduce in this way, which leaves some problems out of practical reach.
Harvard Architecture
The Harvard architecture keeps code and data in distinct memory sections, requiring one memory block for instructions and a separate one for data, each with its own address space. In the earliest Harvard machines, data storage was contained entirely within the central processing unit. Because instructions and data have separate access paths, an instruction fetch and a data access can be completed in a single set of clock cycles.
One early example kept instructions on punched tape or cards, separate from the data. Modern computers may use the same up-to-date CPU technology for both approaches but keep them separate in the hardware design, for example by splitting the instruction and data caches.
Another notable way to characterize a digital computer is by its instruction set architecture (ISA): the collection of instructions that the processor reads and interprets. Two major instruction set styles are RISC (Reduced Instruction Set Computer) and CISC (Complex Instruction Set Computer).
An ISA permits multiple implementations, which commonly differ in performance, physical size, and monetary cost. This enables the evolution of microarchitectures, so that a newer, higher-performance implementation of an ISA can still run software written for previous generations.
Instructions and data are both stored in the same memory in a typical machine that follows von Neumann’s architecture. As a result, the same buses are used to transport both instructions and data. This means that the CPU can’t do both (read the instruction and read/write data) at the same time.
Harvard Architecture is a computer architecture that involves separate instruction and data storage and separate buses (signal paths). It was created primarily to solve Von Neumann Architecture’s bottleneck. The key benefit of providing differentiated instruction and data buses is that the CPU can access all instructions and data at the same time.
Another popular computer architecture, though less so than the von Neumann architecture, is Harvard architecture.
The Harvard architecture keeps instructions and data in separate memories, and the processor accesses these memories using separate buses. The processor is connected to the ‘instructions memory’ using a dedicated set of address and data buses, and is connected to the ‘data memory’ using a different set of address and data buses.
This architecture is used extensively in embedded computing systems such as digital signal processing (DSP) systems, and many microcontroller devices use a Harvard-like architecture.
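For contrast with the von Neumann sketch above, here is the same toy program arranged Harvard-style: instructions sit in one memory and data in another, each with its own access path, so an instruction fetch and a data access can overlap in the same cycle. As before, the instruction format and the cycle counting are illustrative assumptions, not a real device.

```python
# A toy Harvard machine: instructions and data live in separate memories with
# separate access paths, so a fetch and a data access can share a cycle.

instruction_memory = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)]
data_memory = [3, 4, 0]

acc, pc, cycles = 0, 0, 0
while True:
    op, operand = instruction_memory[pc]   # fetch on the instruction bus
    pc += 1
    cycles += 1                            # fetch and data access in one cycle
    if op == "LOAD":
        acc = data_memory[operand]         # access on the separate data bus
    elif op == "ADD":
        acc += data_memory[operand]
    elif op == "STORE":
        data_memory[operand] = acc
    elif op == "HALT":
        break

print(data_memory[2], "after", cycles, "cycles")   # 7 after 4 cycles
```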
Micro-architecture
Micro-architecture is the structural design of a microprocessor: the way a given instruction set architecture (ISA) is implemented in a particular processor. Engineers and hardware designers implement the same ISA with different micro-architectures, which vary with design goals and with changes in technology, and which encompass the technologies, resources, and methods used. It is through the micro-architecture that a processor is physically designed to execute a particular instruction set.
Put simply, it is the logical arrangement of all the electronic elements and data paths in the microprocessor, organized so that instructions can be completed efficiently. In academic settings it is often called computer organization.
System design
System design defines how a system serves user requirements: the overall system architecture, the computer modules with their various interfaces, and data management within the system. The term product development is closely connected to system design; it is the process by which marketing information is turned into a product design.
THE EVOLUTION OF PROCESSOR ARCHITECTURE
Complex Instruction Set Computer (CISC) and Reduced Instruction Set Computer (RISC) are the two major approaches to processor architecture.
CISC processors have a single processing unit, external memory, a relatively small register set, and hundreds of different instructions. They can perform a task in a single instruction, which makes the programmer's job easier because fewer lines of code are needed to get the job done. This approach uses less memory, but individual instructions can take longer to complete.
The RISC architecture was the result of a rethink that led to the development of high-performance processors. The hardware is kept as simple and fast as possible, and complex operations are carried out with sequences of simpler instructions.
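The sketch below illustrates this difference in programming style using made-up instruction names: a hypothetical CISC machine adds two values in memory with one memory-to-memory instruction, while a hypothetical load/store RISC machine needs a short sequence of simpler instructions. Neither corresponds to a real instruction set; the point is the trade-off between instruction count and instruction complexity.

```python
# The same task, "memory[2] = memory[0] + memory[1]", written for a made-up
# CISC machine and a made-up load/store RISC machine.

def run(program, memory):
    regs = [0] * 4
    for op, *args in program:
        if op == "ADDM":                  # CISC: one memory-to-memory instruction
            dst, a, b = args
            memory[dst] = memory[a] + memory[b]
        elif op == "LOAD":                # RISC: only loads/stores touch memory
            rd, addr = args
            regs[rd] = memory[addr]
        elif op == "ADD":                 # RISC: arithmetic stays in registers
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        elif op == "STORE":
            rs, addr = args
            memory[addr] = regs[rs]
    return memory

cisc_program = [("ADDM", 2, 0, 1)]                      # 1 instruction
risc_program = [("LOAD", 0, 0), ("LOAD", 1, 1),
                ("ADD", 0, 0, 1), ("STORE", 0, 2)]      # 4 instructions

print(run(cisc_program, [3, 4, 0]))   # [3, 4, 7]
print(run(risc_program, [3, 4, 0]))   # [3, 4, 7]
```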
Microprocessors are digital systems that read and execute machine language instructions, which are written for humans in a symbolic format called assembly language. They are processors implemented on a single integrated circuit. Common microprocessors in use today include the Intel Pentium series, the IBM PowerPC, and the Sun SPARC, among others. Nearly all modern processors are microprocessors, and most of them are organized as von Neumann machines.