
Friday, 10 August 2012

RAM Memory

Internal random access memory (RAM) is computer memory built directly into a microcontroller chip, such as a computer's central processing unit (CPU). Programmers use internal RAM to speed up programs that address its functions directly and to ensure that data queued for processing is handled more quickly. Internal RAM can improve processor performance because frequently used instructions can be delivered to the CPU faster than fetching them from external RAM.

Know the Types of RAM Memory Modules

Types of RAM modules for a desktop computer system:
  • DIMM (Dual In-Line Memory Module): the most common type in use today. It is a small circuit board containing RAM memory chips, and it is the standard memory in today's desktop computers.
  • SIMM (Single In-Line Memory Module): a type of memory module containing RAM chips, used from the early 1980s to the 1990s. Unlike DIMMs, SIMM modules had to be installed in pairs in the motherboard slots. They are no longer used today.
  • RIMM (Rambus In-Line Memory Module): this module uses RAM chips made by Rambus Inc.

Main Board

The motherboard (mainboard) is the circuit board on which various electronic components, such as the microprocessor and memory (RAM, ROM, BIOS), are connected to each other along with other controller chips. The motherboard connects all the computer's equipment and makes it work together so that the computer runs smoothly.
One form of the motherboard.

 Motherboard components
1. Power Supply
As the name implies, the power supply unit (PSU) supplies power to the other components in the PC. All PC components (except the power supply itself) draw their power from the power supply. The specifications listed

History of AMD

AMD processors are Intel's toughest competitors. AMD has often made faster processors than Intel, which is why AMD is considered more suitable for gaming and for running heavy applications (such as 3D simulation). But because AMD chips ran faster, they were known to be less heat-stable. It can be said that on the issue of speed, AMD came out ahead, but on durability and stability, Intel was superior. In addition, AMD processors have always been cheaper than Intel processors of the same speed.
AMD 386SX
This processor is said to be an exact clone of the Intel 80386 processor design. It was from here that AMD began to emerge

Processor Brands and Types

Processor brands include: Intel, AMD, Apple, VIA Cyrix, IBM, and IDI.

  • Intel has a number of product lines, namely: Pentium, Celeron, Dual Core, and Atom.
  • AMD has several product lines, namely: the AMD Phenom II, AMD Sempron, AMD Athlon II, AMD Athlon X2 dual-core, and AMD A-Series processors.


A PROCESSOR is an IC that controls the entire operation of a computer system; it is used as the center, or brain, of the computer, performing calculations and running tasks.

The History of Intel


  1. In 1971, Intel released the MCS-4 processor series, the forerunner of the i4040 processor. This 4-bit processor was planned to be the brain of a calculator; in the same year (1971), a revision, the i4040, appeared. Originally ordered by a Japanese company for manufacturing calculators, the processor turned out to be much more powerful than expected, so Intel bought back the rights from the Japanese company for further development and research. This was the precursor of the development toward computer processors.
  2. In 1972 the first 8-bit processor, the i8008, appeared, but it was somewhat less popular because it required multiple supply voltages. The i8080 processor then emerged; it changed to a triple-voltage design and used NMOS technology

Intel Company History


This company was founded in 1968 in Santa Clara, California by Gordon E. Moore, a chemistry expert, and Robert Noyce, a physicist. The collaboration of these two brilliant men produced extraordinary results. Moore is also one of the most famous figures in the semiconductor industry thanks to his theory, known as Moore's Law, which essentially states that engineers are able to double transistor density roughly every two years. The third person who has accompanied Intel from its pioneering days through the industry's growth in the 1990s is Andy Grove (a chemical engineer). He can be called Intel's Steve Ballmer, a key figure and the leader of Intel's strategic business. From the late 1990s until the time this article was written, Intel processors were regarded as the most successful and
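Moore's Law as described above can be illustrated with a short arithmetic sketch. This is a simplified model, assuming an exact doubling every two years; the function name and the starting figure of roughly 2,300 transistors (the approximate count of the Intel 4004 from 1971) are used only for illustration:

```python
def transistors(start_count, years, doubling_period=2):
    """Project a transistor count under a simplified Moore's Law:
    the count doubles once every `doubling_period` years."""
    return start_count * 2 ** (years // doubling_period)

# Starting from ~2,300 transistors, project 10 years ahead:
# 5 doublings, so 2300 * 32 = 73600.
print(transistors(2300, 10))
```

In reality the doubling was an observed industry trend, not an exact rule, so the model only shows the shape of the growth: exponential in time, which is why chips of the 1990s held millions of transistors rather than thousands.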

Thursday, 9 August 2012

The fifth generation computer


Defining the fifth generation of computers is quite difficult because this stage is still very young. An imaginative example of a fifth-generation computer is the fictional HAL9000 computer from Arthur C. Clarke's novel 2001: A Space Odyssey.
HAL displays all the functionality desired of a fifth-generation computer. With artificial intelligence, HAL could reason well enough to hold conversations with humans, use visual input, and learn from its own experience.
Although realizing a HAL9000 may still be far from reality, many of its functions have already been achieved. Some computers can accept spoken instructions and can imitate human reasoning. The ability to translate foreign languages has also become possible. Such a facility looks deceptively simple. However, it turns out to be far more complicated than expected once programmers realize that human understanding relies heavily on context and meaning rather than on translating words directly.
Many advances in the field of computer design and technology

The fourth generation of computers


Several vendors have announced "fourth-generation computers" and some have announced "fifth-generation computers". This is just a marketing strategy. The three previous generations were each distinguished by an important electronic breakthrough: vacuum tubes, then transistors, then integrated circuits. The fourth generation arrived with less striking developments, merely computers and software that had advanced somewhat, and this generation of computers may not be as lucky as the previous ones in 'ruling' the world market before the next new technological breakthrough. This is why some people often refer to this generation as generation 3½.
Microprocessor
One of the many contributions to the appearance of the next generation of computers is the microprocessor. A microprocessor is contained on a single silicon chip. The microprocessor was first demonstrated by Texas Instruments in 1971. Its price could be a few dollars, and it can be found in anything from household machines to satellites.
Microcomputer
This processing device is small and relatively inexpensive, but high-performance. The microprocessor is 'contained' in

The third generation computers


Although the transistor in many respects surpassed the vacuum tube, transistors still generated considerable heat, which could potentially damage the internal parts of a computer. Quartz rock was later found to eliminate this problem. An engineer at Texas Instruments named Jack Kilby developed the integrated circuit (IC) in 1958. An IC could combine three electronic components on a small silicon disc made from quartz. Scientists later managed to fit more and more components onto a single chip, called a semiconductor, producing ever-smaller computers because the components could be squeezed onto the chip. The third-generation development to

Second Generation Computer


In 1948, the invention of the transistor greatly influenced the development of the computer. Transistors replaced vacuum tubes in televisions, radios, and computers, shrinking electric machines that had initially been very large.
Transistors began to be used in computer technology in 1956. Combined with another discovery, magnetic-core memory, second-generation computers were smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to take advantage of the new technology were supercomputers. IBM created a supercomputer named Stretch, and Sperry-Rand created one called LARC. These computers, developed for atomic energy laboratories, could handle large amounts of data, a capability needed by atomic scientists. LARC machines were very expensive and tended to be too complex for business computing needs, so their popularity was limited. Only two LARCs were ever installed and used: one

The first generation of computers


The first generation of computers began during the Second World War, when the countries involved in the war sought to develop computers to maximize their strategic capabilities. This increased funding for computer development and accelerated technical progress. In 1941, a German engineer named Konrad Zuse built a computer, the Z3, to design airplanes and missiles.
Elsewhere, the Allied powers were also making progress in developing computer capabilities. In 1943, the British completed a secret code-breaking computer called Colossus, which served to break the secret codes used by Germany. Colossus did not actually affect the development of the computer industry much, for two reasons: first, Colossus was not a versatile, general-purpose computer; it was made only to break secret codes. Second, the existence of the machine was kept secret until decades after the war

Historical Development of the Computer


Today everyone is familiar with the computer, an advanced tool that can do almost anything, from office work to assisting processes in factories. Today's computers are also used for entertainment and playing games. A computer is a tool used to process data according to instructions that have been formulated. The word 'computer' was originally used to describe a person whose job was to perform arithmetic calculations, with or without mechanical aids, but the word was later transferred to the machine itself. Originally, information processing was almost exclusively related to arithmetic problems, but modern computers are used for many tasks unrelated to mathematics.
A discussion of the history of computer development cannot be separated from humanity's inventions since time immemorial, both mechanical and electronic. The computers commonly used today are the result of a long evolution from the earliest computers, whose use was very limited because they had not been commercialized and required special skills to

5 Ways to Connect Laptop to TV

If you want to display video or images stored on your laptop on a big screen but do not have a projector, you can connect the laptop to a TV to get a larger screen than the laptop's own.
There are several ways to connect a laptop to a TV screen. Here are 5 ways you can follow:
  1. Connection using the S-Video port
    Super Video, or the so-called S-Video port, is available on laptops that feature TV-out. If your laptop has this feature, you can connect it to both analog and digital TVs.
  2. Connection using the DVI port
    DVI, or Digital Video Interface, is also available on many laptops on the market. The latest TV models usually have a DVI port as well. To use audio with a DVI connection, you need a separate audio cable alongside the DVI cable.
  3.  Connection using Wireless Converter