A Computer Not Compatible with the IBM PC

Adding SD card support immediately raised two big questions: hardware support for the SPI bus, and the protocol for talking to the card itself.

In principle, SPI can be implemented entirely in software, but I wanted some hardware fun too, so I heroically set about drawing a byte transceiver in schematic entry. To my surprise, there was nothing complicated about it, and pretty soon I saw 8-bit packets briskly running across the oscilloscope screen, containing exactly what I wanted. Incidentally, this was the first time I appreciated the new oscilloscope's ability not just to show a bunch of signals but to combine them logically into the corresponding bus. It is far more pleasant when the oscilloscope itself figures out that byte A5 is being transmitted than to check manually whether the 0-to-1 transitions (and back) are in the right places.

To simplify the task, I did not try to support every type and variety of card, but limited myself to an original SD card (not SDHC or any other variant). A little programming later, the contents of sector 0 of the card appeared on the screen. Right after that, I wrapped these functions into a rough semblance of INT 13h, added an embryonic INT 19h (bootstrap loader), and saw the following on the screen:
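For a sense of what "the protocol for talking to the card itself" involves, here is a rough Python sketch of how an SPI-mode command frame is put together. The opcodes and the CRC7 polynomial come from the SD specification; the function names are mine, for illustration only:

```python
def crc7(data: bytes) -> int:
    """CRC7 (polynomial x^7 + x^3 + 1), as required by SD command frames."""
    crc = 0
    for byte in data:
        for i in range(7, -1, -1):
            inbit = ((byte >> i) & 1) ^ (crc >> 6)
            crc = (crc << 1) & 0x7F
            if inbit:
                crc ^= 0x09
    return crc

def sd_command(cmd: int, arg: int) -> bytes:
    """Build a 6-byte SD command frame: 0x40|index, a 32-bit big-endian
    argument, then the CRC7 shifted left with a stop bit in bit 0."""
    body = bytes([0x40 | cmd]) + arg.to_bytes(4, "big")
    return body + bytes([(crc7(body) << 1) | 1])

# CMD0 (GO_IDLE_STATE) must carry a valid CRC to switch the card into
# SPI mode; its well-known frame ends in 0x95.
assert sd_command(0, 0).hex() == "400000000095"

# CMD17 (READ_SINGLE_BLOCK) of sector 0 -- on an original SD card the
# argument is a byte address, so sector N would be N * 512.
frame = sd_command(17, 0 * 512)
assert frame[0] == 0x51
```

After CMD17 the card answers with an R1 status byte, then a 0xFE data token followed by the 512 data bytes, which is essentially all my first version of INT 13h needed to handle.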

Since at that point every read returned sector 0 regardless, the bootloader (which lives in that very sector) could not find an OS to boot, and said so. But that was a trifle — the main thing was that my circuit had slowly begun to turn into a real computer and had even tried to boot!

Next came the battle with converting physical sectors into logical blocks. Here I cut corners again: instead of properly determining the disk (image) parameters, I simply hardcoded the numbers for one specific image. This part took some tinkering — for some reason the calculations kept producing completely unexpected results (I have never liked doing arithmetic in assembler). Nevertheless, after some torment, physical sectors/cylinders/heads were reliably translated into logical blocks, and it was time to attempt a serious boot.
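The conversion that caused all this grief is trivial on paper; it was the 16-bit assembler arithmetic that hurt. In Python it is a one-liner (the geometry numbers below are just an example — as mentioned, I simply hardcoded the ones for my image):

```python
def chs_to_lba(c: int, h: int, s: int,
               heads_per_cyl: int, sectors_per_track: int) -> int:
    """Standard BIOS geometry conversion; note that sector numbers
    are 1-based while cylinders and heads are 0-based."""
    return (c * heads_per_cyl + h) * sectors_per_track + (s - 1)

# With a classic XT-style geometry (4 heads, 17 sectors per track):
assert chs_to_lba(0, 0, 1, 4, 17) == 0    # very first sector of the disk
assert chs_to_lba(0, 1, 1, 4, 17) == 17   # first sector of head 1
assert chs_to_lba(1, 0, 1, 4, 17) == 68   # first sector of cylinder 1
```

The resulting logical block number, multiplied by 512, is exactly the byte address the original SD card expects in a read command.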

Naturally, booting did not succeed right away, and I did not expect it to. Knowing in advance that a bunch of functions were not implemented in my BIOS, I put stubs on all interrupts, and any call to an unimplemented function printed all the relevant information on the screen — which interrupt, and with which arguments. Then I would write a handler for the corresponding function (more often just a temporary stub), and the process continued. Suddenly it all stopped at a function missing from the original PC altogether — one of the event-handling functions of INT 2Fh. I could see that DOS detects the PC type and seemingly should not invoke interrupts absent on that type, but it happened nevertheless, and the boot stalled. A simple stub did not help, and on principle I did not want to implement the entire function.

I no longer remember the whole train of thought (I was digging through a lot of the DOS source code and the boot process at the time), but during yet another such "hang" I decided to fire off a bunch of interrupts (the timer on INT 08h was disabled at that point) and pressed the Shift key... Suddenly, a miracle happened:

To be honest, quite a flood of emotions hit me — going from a breadboard with a couple of chips to booting DOS in a month, and in short raids at that (chronic lack of time), seems pretty cool (forgive the bragging)!

By the way, this episode left me with a still-unsolved riddle. After I finished the timer interrupt, DOS began to boot without freezing at that spot, but for some reason the Microsoft copyright message is not displayed. It seems it is not displayed on a real computer either (unfortunately, I have nothing to try it on). The root cause remains a mystery shrouded in darkness. I tried to follow the logic through the DOS source code, but did not see it right away and did not want to sink a lot of time into it. Still, the question quietly gnaws at me...

After DOS came the turn of other programs. You can probably guess whose turn came first — of course, the good old Norton Commander. Oddly enough, there was much more fuss with it than with DOS. At startup, NC called a wild number of functions, and in some cases simple stubs would not do; at least minimal functionality had to be written.

However, the problems were quantitative rather than qualitative, and soon the NC startup process was brought to its logical conclusion:

This "interesting" appearance was due to several reasons:
- the video adapter did not yet support attributes
- I did not have the second half of the character generator, the one containing the pseudo-graphics, so characters from the lower half of the code table appeared in their place
- some INT 10h functions were not implemented.

In general, I was periodically surprised by how certain functions were implemented in various programs (and even in DOS itself). For example, the CLS (clear screen) command called the INT 10h function that scrolls a window up. The window specified was the entire screen area, and it was scrolled by a number of lines equal to the height of the screen. Since I had not expected anyone to use the windowing functions at all, I had been in no hurry to implement them. The result was plain to see (on the screen, to be precise). But we will return to the oddities of some programs a little later...
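For the curious, the semantics DOS relies on can be modeled in a few lines of Python — a sketch of the INT 10h AH=06h (scroll window up) behavior as I understand it, not a reference implementation:

```python
def scroll_up(screen, top, left, bottom, right, lines, blank=" "):
    """Model of INT 10h, AH=06h: scroll a text window up by `lines`
    rows, filling vacated rows with blanks. A count of 0 (or anything
    >= the window height) blanks the whole window -- which is exactly
    how CLS clears the screen."""
    height = bottom - top + 1
    if lines == 0 or lines >= height:
        lines = height
    for row in range(top, bottom + 1):
        src = row + lines
        for col in range(left, right + 1):
            screen[row][col] = screen[src][col] if src <= bottom else blank
    return screen

# CLS as DOS does it: scroll the entire 80x25 window by 25 lines.
screen = [["X"] * 80 for _ in range(25)]
scroll_up(screen, 0, 0, 24, 79, 25)
assert all(ch == " " for row in screen for ch in row)
```

So a BIOS that never implements the windowing functions gets a CLS that clears nothing at all — the result I saw on my screen.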

After getting NC to launch, I naturally wanted to bring it into divine form. Besides, this part of the work is sometimes even more enjoyable than trying to revive a completely dead device. Pseudo-graphics posed no special problems — just a lot of time drawing characters by hand (my character generator existed directly as VHDL code). The attributes, however, took a bit more strain.

Earlier in the process I had started using some VHDL elements — at first almost by force (after all, I wanted to try mastering this language again), and later because in certain cases it proved more convenient than schematic entry. I even had to dig into the code of the video adapter itself: initially it supported 43 lines (or thereabouts), and I needed to rework it for 25. I first tried to implement attribute support in the schematic editor, but gradually realized it might be easier to do in VHDL. Naturally, everything moved with great difficulty and used only the simplest language constructs, but I suddenly began to grasp the essence of VHDL — only a little, but already enough to start consciously creating something in it rather than merely tweaking what was already there.

My fiddling with VHDL was not in vain, and after a while I saw something long familiar:

Yes, some imperfections were still visible (such as an attribute shifted by one character), but overall the 80x25 color text mode worked as it should.

Next in line was the 8259 interrupt controller. At first I thought of reusing an existing one from some project, but for various reasons I liked none of them (either too primitive or, on the contrary, I could not understand how they worked and there was no documentation). There was even an attempt to buy a commercial IP core (here IP meaning Intellectual Property, not Internet Protocol), but the vendors could not be bothered to sell a single unit...

In the end, I had to take a piece of paper, sketch something like a block diagram of the controller, and then implement it in VHDL. I did not pursue full compatibility — at this stage I needed support for the one main priority-interrupt mode, the ability to mask interrupts (and to read back the interrupt mask), and the EOI (End Of Interrupt) command. In my opinion, that should be enough for the vast majority of programs. Looking ahead, I will say that to this day I have not found a single program that tries to do anything with the interrupt controller beyond the functionality I implemented.
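In Python terms, the subset I kept boils down to something like the toy model below. The register names follow the 8259 datasheet (IRR, IMR, ISR); everything else is simplified to the point of caricature:

```python
class MiniPIC:
    """Toy model of the 8259 subset described above: fixed priority
    (IRQ0 highest), maskable requests, and a non-specific EOI."""

    def __init__(self):
        self.irr = 0   # interrupt request register: pending requests
        self.imr = 0   # interrupt mask register: 1 = masked
        self.isr = 0   # in-service register: handlers currently running

    def request(self, irq: int):
        self.irr |= 1 << irq

    def acknowledge(self):
        """Return the highest-priority pending unmasked IRQ, or None."""
        pending = self.irr & ~self.imr & 0xFF
        for irq in range(8):
            if self.isr & (1 << irq):
                return None          # equal/lower priorities must wait
            if pending & (1 << irq):
                self.irr &= ~(1 << irq)
                self.isr |= 1 << irq
                return irq
        return None

    def eoi(self):
        """Non-specific EOI: clear the highest-priority in-service bit."""
        for irq in range(8):
            if self.isr & (1 << irq):
                self.isr &= ~(1 << irq)
                return

pic = MiniPIC()
pic.imr = 0b00000010            # mask IRQ1 (keyboard)
pic.request(1); pic.request(4)
assert pic.acknowledge() == 4   # masked IRQ1 is ignored, IRQ4 is served
pic.eoi()
assert pic.isr == 0
```

The real chip has several priority modes, cascading, and specific EOIs; as noted above, no program I ran ever asked for any of them.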

The interrupt controller was probably my first real (albeit small) VHDL project from start to finish. I wrote it carefully, and for the first time in my life I was not even too lazy to make a test bench — essentially a sequence of signals to verify that the device functions correctly. Simulation in ModelSim showed the controller was fully operational, after which a schematic symbol was generated from it and added to my design.

I did not yet have a proper 8254 timer; to generate interrupts at 18.2 Hz, an ordinary counter was used, wired to the interrupt controller. The computer's behavior suggested everything worked — DOS booted without my having to press a key, and the clock in NC finally started ticking. Another stage seemed complete, and I could safely move on.
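Incidentally, the magic 18.2 Hz figure is nothing more than the PC's 1.193182 MHz timer clock divided by a free-running 16-bit counter:

```python
# The PC derives the timer clock from a 14.31818 MHz crystal divided
# by 12; a 16-bit counter left to free-run divides it by 65536.
PIT_CLOCK_HZ = 1_193_182
divisor = 65536
freq = PIT_CLOCK_HZ / divisor      # ticks per second

assert abs(freq - 18.2065) < 0.0001
```

A plain counter rolling over every 65536 clocks therefore gives exactly the interrupt rate DOS and the BIOS expect from channel 0 of the real timer.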

As it turned out, I had celebrated too early — at this point perhaps the biggest problem of the whole project surfaced. If anyone remembers, NC has a built-in screen saver, the "starry sky". Having left my computer alone for a while, I returned to find that the stars on the splash screen had frozen — in other words, the computer had hung. Although I understand that such things do not happen by accident, I still wanted to believe in a miracle — that this was an isolated incident. As always, the miracle did not happen: after a full reset and restart, the computer froze again after an hour or so of work. It became unambiguously clear that there was a problem somewhere, and a very hard one to find.

To narrow the search as much as possible, I wrote a simple memory test that ran immediately after processor reset, without initializing any unnecessary devices such as the timer. Frankly, I took the memory-error indication with relief — at least the problem was clearly in the hardware. All that remained was to figure out exactly where. And that turned out to be not easy at all.
The thing is, the circuitry involved in the memory test is inherently rather primitive: a minimum of logic is involved, and besides the processor there are no other complex programmable elements. After some time spent analyzing the schematic, I became more or less confident that it was not a fundamental design error but something more random — interference, for example.

With this side of circuit design, things were generally bad for me. I knew that I should add more decoupling capacitors and that long wires were somehow bad; that was where my knowledge ended. So I turned once again to one of the professional forums for advice. I got plenty of it, and sometimes it was hard to separate genuinely sensible advice from that given on the principle of "I'll say everything I know that is even remotely on topic." I will not describe it all here — too much was discussed, so it may become the topic of a separate article. As a result of the discussions, my board sprouted nearly two dozen decoupling capacitors and completely lost its original, more or less glamorous look.

Unfortunately, the next run of the test showed the problem had not gone away. Perhaps it showed up somewhat less often, but it is hard to say — previously a failure could occur within 20-30 minutes, now within a few hours. At the very least, a board left running overnight was guaranteed to have failed by morning. In desperation, I returned to analyzing the circuitry and studying the processor bus waveforms even more closely. At one point I had a definite idea and went back to the same forum. In the discussion I received another portion of useful (and sometimes not so useful) advice and tried a few things (primarily adding small delays to some control signals), but none of it affected the failures at all.

A concrete dead end was clearly visible at the end of that road, so I began testing outright crazy ideas — in particular, whether the memory chip itself was faulty. For the test, I generated a RAM module directly inside the FPGA and used it instead of the external memory... Honestly, I did not expect a result — I was just trying whatever came to mind. Imagine my surprise when the crashes suddenly disappeared! I was not really prepared for this, and did not quite know what to do with the knowledge. Even then it was hard to believe the memory chip was defective, and I was almost completely sure I was driving it correctly — by the control signals, everything was dead simple. But the fact remained: with the external chip, a failure was guaranteed within a few hours of testing; with internal memory, everything ran flawlessly for several days until I got tired of it.

To clear my conscience, I decided to test the memory with a completely different circuit, without my processor board. While pondering how best to do this, an idea suddenly struck me — I realized the one significant difference between using internal and external memory. The external memory was asynchronous, while the internal memory was partially synchronous: it additionally required a signal that latched the address of the accessed cell into an internal buffer.
I could not see at all how this related to random failures — every waveform showed my address held far longer than the memory minimally required, so in theory this could not be the cause. Nevertheless, I immediately drew another register in Quartus, fed the address to it, and latched it with the same signal used for the internal memory; the register output, of course, drove the address lines of the external memory. Fully aware that I was doing something nonsensical, I ran the test. It ran successfully until I switched it off the next day. A couple more runs with and without the register showed quite clearly that the register removed the failures completely.

It was completely inexplicable — even on the oscilloscope I could see the address signals holding far longer than could possibly be necessary, yet the fact remained a fact. After a whole weekend of investigation, I gave up and decided to accept it as a given...

So, DOS booted, and many programs that did not require graphics mode ran; it was time to move on. Naturally, I wanted to launch some kind of game. But a game, as a rule, requires graphics, which I did not yet have. And while for the text video adapter I had gotten away with lightly reworking an existing one, with graphics it was not so simple.

It was not even a matter of lacking ready-made solutions. The problem was that I needed near-complete compatibility with a standard video adapter at the hardware level — after all, games work with graphics directly on the hardware, bypassing the BIOS. I realized it would be easier to build a video adapter from scratch than to try to rework a ready-made one. And, of course, doing it myself was much more interesting.

So, we are writing our own CGA adapter — even EGA is a couple of orders of magnitude more complicated, so we will not attempt it for now. To get started, I admittedly did peek a little: I found what were essentially sketches of a VGA sync-generation module. It was half a dozen lines, and not fully working at that, but it genuinely served as a template to start writing from — that made it morally easier.

Naturally, I do not have a CGA monitor and never planned to get one, so the idea was to use the VGA 640x400 mode, onto which the 320x200 CGA mode maps perfectly by simply doubling the pixels both horizontally and vertically.
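The arithmetic behind the doubling, plus CGA's famously interleaved framebuffer, looks roughly like this (a sketch; the layout constants are the standard CGA ones):

```python
def cga_320x200_addr(x: int, y: int):
    """Byte offset (from segment B800h) and bit shift of pixel (x, y) in
    CGA 320x200 4-colour mode: even scanlines live in the first 8 KB
    bank, odd scanlines at +0x2000, 80 bytes per scanline, and each
    byte packs 4 pixels of 2 bits, leftmost pixel in the high bits."""
    offset = (y % 2) * 0x2000 + (y // 2) * 80 + x // 4
    shift = (3 - x % 4) * 2
    return offset, shift

def vga_source_pixel(vx: int, vy: int):
    """Which CGA pixel a VGA 640x400 dot shows with 2x2 doubling."""
    return vx // 2, vy // 2

assert cga_320x200_addr(0, 0) == (0, 6)         # top-left pixel
assert cga_320x200_addr(0, 1) == (0x2000, 6)    # odd lines in 2nd bank
assert cga_320x200_addr(319, 0) == (79, 0)      # last pixel of line 0
assert vga_source_pixel(639, 399) == (319, 199)
```

The VGA side of the adapter simply fetches each CGA pixel once and emits it on two consecutive dots of two consecutive scanlines.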
In general, the graphics adapter came out unexpectedly easily — by this point my brain had suddenly learned to think in VHDL terms, plus I had some sense of what can and cannot be asked of VHDL. Most of my debugging time went into hunting a perfectly stupid error related to the bit width of numbers (two such problems overlapped and produced a very amusing combination). Beyond that, I began to enjoy watching lines in the editor turn into almost real "hardware" inside the FPGA and do exactly what I wanted.

At the very beginning, of course, the adapter was far from perfect and far from compatible, but Checkit managed to recognize it and even display the first test picture:

Checkit, by the way, turned out to be quite a useful program — it detected many things in rather tricky ways, which pushed the whole design toward ever greater PC compatibility. And since Checkit could test every node and component, compatibility got exercised across all parts of the system.

After correcting the most obvious blunders (such as the duplicated pixel from the previous byte, visible in the previous photo), I managed, with some difficulty, to find a game that seemed to actually work:

The colors in this picture do not match the originals — at that point palette switching was not yet implemented, and the colors themselves had not been tuned at all.

Attempts to find working games showed that game programs, which mostly talk to the hardware directly, are far more demanding of compatibility than any NC or even QuickBasic. Fortunately, the FPGA offered almost unlimited possibilities for catching program accesses to ports or memory addresses of interest, and since I could also change the BIOS at will, this made an excellent debugging mechanism. At some point (I do not remember exactly when), Turbo Debugger started working too, which further expanded the arsenal of debugging tools.

It immediately became clear that at least a minimal 8253 timer was needed. Programs used the timer not only for sound (channel 2) but also actively reprogrammed channel 0, changing the frequency of timer interrupts, and used that channel to measure time intervals.

After reading the 8253 documentation, I felt a little sad: there was a lot to do, and none of it very interesting. Deciding to do it properly sometime later, for the moment I simply went to opencores and lifted a couple of timer modules. One was in Verilog and very simplified; the second seemed extremely sophisticated, and in VHDL at that. Unfortunately, the VHDL timer was connected via the Wishbone bus — an open interconnect standard for FPGA designs. I had never dealt with Wishbone before, so I decided to start with the Verilog module, whose interface looked simpler.

After connecting the timer to my system fairly painlessly, I ran some simple tests and satisfied myself that the module seemed to work. Moreover, after one more slight modification of the speaker interface, the first — and quite correct — sounds came out of a running game. The timer could wait to be finished properly; time to move on.

Then I had to make a cardinal decision. Until this point, I had written INT 10h myself. In text mode that was tolerable, but the prospect of supporting these functions in graphics modes depressed me. Considering that by then my passion for assembler programming was essentially sated (I had, after all, once done it in industrial quantities), I acted on the principle of "if the mountain will not come to Muhammad, send it packing." Namely, I decided to make my CGA adapter so hardware-compatible that the original BIOS could drive it.

In principle, there was no particular difficulty — there are not many registers, and their functionality is extremely simple. Among the less obvious things, I had to emulate the status register, which contains the flags for horizontal and vertical retrace. Quite logically, it turned out that many programs (the BIOS included) actively poll this register to avoid "snow" when the processor and the adapter try to access video memory simultaneously.
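The polling loop those programs run looks roughly like this (the port and bit assignments are the standard CGA ones; the callback merely stands in for an IN instruction and is hypothetical, for illustration):

```python
# Bits of the CGA status register at port 0x3DA (standard assignments):
DISPLAY_INACTIVE = 0x01   # bit 0: safe to touch video memory right now
VERTICAL_RETRACE = 0x08   # bit 3: vertical retrace in progress

def wait_for_safe_access(read_status):
    """Classic snow-avoidance loop: wait for the *start* of a blanking
    interval before touching video memory. `read_status` stands in for
    reading port 0x3DA."""
    # First wait out the tail of any interval already in progress...
    while read_status() & DISPLAY_INACTIVE:
        pass
    # ...then wait for the next interval to begin, so a full blanking
    # period is available for the memory access.
    while not (read_status() & DISPLAY_INACTIVE):
        pass

# Simulated status samples: mid-blanking, active display, then blanking.
samples = iter([DISPLAY_INACTIVE, 0x00, 0x00, DISPLAY_INACTIVE])
wait_for_safe_access(lambda: next(samples))
assert next(samples, "done") == "done"   # loop consumed exactly 4 samples
```

Emulating those two status bits with plausible timing was what let snow-aware software run unmodified on my adapter.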

For some reason, tidying up the video adapter proved very engaging, and in the end this unit became the most thoroughly elaborated in terms of compatibility with the original device. Along the way, missing features were added, such as switchable palettes, the 640x200 mode, and so on. Incidentally, finding a program that even supports the 640x200 mode turned out to be surprisingly hard... The only thing unearthed was chess:

In my opinion, it looks pretty nice ...

The original INT 10h handler took to this adapter very kindly, and I sighed with relief at not having to write things like recognizing a character printed at a given screen position in graphics mode.

The final obstacle to acceptable PC compatibility was, oddly enough, the keyboard. Although it was almost the first thing I bolted onto the project, from a compatibility standpoint no work had been done at all. The main problem is that all normal programs work with scan code set 1, the one used in the IBM PC, whereas all keyboards from the PC AT onward emit at least set 2, which differs substantially. It is the keyboard controller inside the computer that converts those codes back into the original set 1, and all ordinary programs deal with that (even programs that appear to access the keyboard directly, bypassing the BIOS). Naturally, I had no such controller (in the PC AT, and even in later PC XTs, a separate microcontroller — the 8042 — handled this). My INT 09h/16h functions existed only in a very minimal version, and direct keyboard access by programs was out of the question — they simply would not have understood a single scan code.
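The translation itself is mostly a lookup table plus handling of the F0 "break" prefix. A tiny Python sketch with a few well-known codes (the real table covers every key; only this excerpt is shown here):

```python
# Excerpt of the set-2 -> set-1 make-code table. The four entries below
# are well-known values; a real translator needs the complete table.
SET2_TO_SET1 = {
    0x76: 0x01,   # Esc
    0x16: 0x02,   # '1'
    0x5A: 0x1C,   # Enter
    0x29: 0x39,   # Space
}

def translate(set2_stream):
    """Convert a stream of set-2 scan codes to set 1. In set 2 a key
    release is an F0 prefix followed by the make code; set 1 encodes a
    release by setting bit 7 of the make code instead."""
    out, release = [], False
    for code in set2_stream:
        if code == 0xF0:              # break prefix: next code is a release
            release = True
            continue
        set1 = SET2_TO_SET1[code]
        out.append(set1 | 0x80 if release else set1)
        release = False
    return out

# Press and release Esc:
assert translate([0x76, 0xF0, 0x76]) == [0x01, 0x81]
```

Extended keys (the E0-prefixed ones) complicate matters a little further, but the principle stays the same — which is why the module seemed like it should be only a dozen lines of VHDL.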

By this time I was feeling euphoric about my command of VHDL — it seemed I had grasped the truth and could do anything. So an elegant (as it seemed to me) VHDL module performing the scan-code translation was written without delay. Everything about this module was very nice and good, except for one small detail — it did not work. Worse, I could not understand why, which frustrated and bewildered me: it was only a dozen lines.

Turning once more to the experts on the forum, I received a fair amount of genuinely sensible advice. Moreover, my understanding of the very concept of VHDL changed almost radically once again (with some disappointment mixed in). The main lesson: there are no miracles. VHDL (like every other HDL) will not do anything that cannot be built in the usual way from the available hardware resources. If I write a line that is correct from the standpoint of language syntax but have no idea how it could be implemented in hardware, then most likely it will not be implemented at compilation — or at least will not do what is required. It is also very important to use templates: many language constructs turn into correct hardware nodes only when the compiler recognizes the appropriate pattern. There is some flexibility, of course, but you must always keep the recommended description styles for particular nodes in mind.

I think it was after these battles that I truly began to understand at least a little of the essence of VHDL (and by then Verilog had also ceased to be completely incomprehensible). In some magical way, the textbooks on these languages suddenly made sense, and behind the words the essence of the things described became clear.

In short, having made the converter module a little less beautiful but much more correct, I got set-1 codes at its output. All that remained was to feed those codes to the original INT 09h handler and verify keystroke recognition with the same Checkit. Thus the keyboard, too, became almost 100% hardware-compatible.

By this point, having the schematic as the top level of the project was bothering me more and more. The final push toward a full transition to VHDL came with a change of home computer: an iMac Retina with Windows installed landed on my desk... Unfortunately, Quartus was among the programs completely unprepared for that screen resolution. The schematic became utterly unreadable, and none of my attempts to tweak anything brought real improvement. With nowhere left to go, I gritted my teeth and took up the text editor.

Oddly enough, everything went more than smoothly. I do not even remember now whether anything needed debugging or whether it all worked right after the rework. In any case, there were no serious hitches, and working immediately became much more convenient and efficient. I remembered the advice of a number of knowledgeable people who had urged me from the very beginning to forget the schematic editor and start straight away with VHDL/Verilog. By the way, regarding VHDL vs. Verilog — please do not argue with me about which is better or worse and why I chose VHDL. Let us just say I wanted to, which is practically the truth. I will not discuss the topic further...

With the transition to VHDL, the last module remaining in schematic form — the SPI interface — was also completely redone. If you remember, it provided hardware reception/transmission of only one byte, with a number of preparatory steps needed around each one. Coupled with a slow processor (and a lazily written INT 13h), this gave only about 35% of the speed of the original PC XT hard disk (according to Checkit). Since I by now almost felt like a guru of VHDL and digital electronics in general, I decided to write not a copy of the existing interface but a module providing packet transfers.

True, I decided not to bother with DMA yet — there was no DMA controller, and I did not want to take on two new modules at once: you would never figure out where a problem was. Debugging the module was not entirely smooth — I had to tinker a bit, actively using the oscilloscope's digital channels as a protocol analyzer. For some reason, throughout this process I kept forgetting that Quartus includes a built-in logic analyzer, SignalTap, which would probably have been even more convenient. Perhaps I will get around to it someday (I have never used it), but so far I really like using a separate piece of hardware for the job.

With the new module, INT 13h could probably have been rewritten more seriously, but I was lazy and got away with only the minimum necessary modification. The result was a not very beautiful and thoroughly inefficient pile-up, but even so, the speed with the new module increased almost fivefold:

Then came the partly tedious, partly fascinating process of launching various programs (primarily games) to find out why they did not work (or rather, where my computer was not compatible enough). The hunt for causes could fill a separate large article; I will just give a few examples:
- I have no DMA. It turned out that DMA channel 0 (used for memory regeneration on original PCs) is also used by some programs as a counter for measuring short time intervals. I had to emulate the corresponding part of the DMA controller's counters
- usually (but not always), reading from a nonexistent memory area or I/O port returns an FF byte. Mine returned 00 instead. This displeased a program that used this method (and nothing else) to detect a joystick, concluded one was present, and decided all its buttons were pressed
- the most original way of detecting a CGA adapter was used by a program that wrote a certain value to the cursor-location register, read it back, and compared it with what it had written (then restored the original value). According to the documentation I have, this register is supposedly write-only, but I changed it to read/write, after which the program calmed down
- one not related to my computer: I spent a long time working out why the simplest old game, Paratrooper, hung. It turned out that although the game is old, my copy had been compressed with a self-extracting COM/EXE packer, and the part responsible for unpacking the program at startup contained an instruction that appeared only with the 286 processor. The trouble was that this instruction barely affected the unpacking process, corrupting only occasional bytes (fewer than one in a thousand). This investigation probably consumed more of my time than any other.

So, little by little, almost all the games I had began to launch and run without problems; some of them I even tried to play:

While launching numerous games, it emerged that my timer module was far from perfect — in most cases the sounds were not quite right. Having decided that I still wanted to tackle the Wishbone bus anyway, I set about wiring up the VHDL timer I mentioned earlier. To begin with, I read the Wishbone description and put together a sort of adapter between the Wishbone interface and the 8088 bus — nothing complicated. Unfortunately, the timer did not work. Out came the oscilloscope again, to see what was going on (first of all, whether the Wishbone signals were being formed correctly).

Who would have thought that a great discovery awaited me at this moment... Remember how I suffered with the memory failures and had to add an intermediate register whose necessity I could not see at all? Well, this is the picture I got on the oscilloscope screen:

Naturally, the first thing that caught my eye was the terrible ringing on signal 2. Worse, this ringing had crossed from a quantitative problem into a qualitative one. Signal 6 is produced by a single-bit counter whose input is signal 2 — in effect, signal 6 toggles on every rising edge of signal 2. But the oscillogram shows signal 6 toggling not only on the proper edge of signal 2, but also on the edge of the strongest "ring"! In other words, on some lines of my circuit the ringing had enough amplitude to cause false switching of the logic. To say I was stunned is to say nothing. I could hardly believe that, with all this going on, I had managed to achieve stable operation at all...

After a brief analysis of the circuit in light of the new data, it became completely clear to me where the old failures had come from and why that register cured them. Nevertheless, something had to be done, since it was precisely signal 2 that I needed for the new timer module. Once again came the traditional appeal to the experts. From several suggestions on the forum, I chose cutting the trace and soldering in a resistor. The result was far from ideal, but in several hours of testing I recorded no more false switching from ringing:

Unfortunately, this did not make the VHDL timer module work - it remained silent. After tinkering for some more time, I discovered the reason in a rather unexpected place - the module itself. Moreover, it was quite prosaic (and common in programming): the module mishandled one of the boundary values. With a divisor of 0, instead of dividing by the maximum value (65536), it did nothing. All along I had been testing exactly the initialization of channel 0, which is programmed with the maximum divisor to obtain a frequency of 18.2 Hz. When I used a divisor of FFFF for the experiment, everything worked.
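For reference, the divisor semantics that tripped up the module can be sketched in a few lines of Python (the constant and function names are mine, not part of the module):

```python
# Sketch of the 8253/8254 PIT edge case described above: a programmed
# divisor of 0 must behave as 65536 (the maximum), not as "do nothing".
PIT_CLOCK_HZ = 1_193_182  # standard PC timer input clock

def pit_output_freq(divisor: int) -> float:
    """Return the channel output frequency for a 16-bit divisor value.

    The counter reloads before decrementing, so a programmed value of 0
    wraps around and acts as a division by 65536.
    """
    if not 0 <= divisor <= 0xFFFF:
        raise ValueError("divisor must fit in 16 bits")
    effective = divisor if divisor != 0 else 0x10000  # 0 means 65536
    return PIT_CLOCK_HZ / effective

# Channel 0 is initialized with the maximum divisor -> the ~18.2 Hz tick
print(round(pit_output_freq(0), 1))  # 18.2
```

Treating a divisor of 0 as "do nothing" instead of "divide by 65536" is exactly the boundary-value bug described above.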

I even contacted the author of the module, who by then had forgotten he had ever written it. Nevertheless, he helped me find the specific place where the mistake had been made, and I even managed to fix the error somehow. That particular problem was solved, but others surfaced, so for now I settled on the first version of the module, written in Verilog.

At this point, my design was ready enough that I was ripe for the main experiment. The thing is, back in 1986 I read an article in the magazine "In the World of Science" (the Russian edition of Scientific American) about the newest product from Microsoft - the game MS Flight Simulator. Considering that even then I was a computer enthusiast, yet at the same time determined to become a pilot, you can imagine what emotions seethed in my head (and in other parts of my body).

And now, almost 30 years later, I had an irresistible desire to run that very historical Flight Simulator on my computer. My interest was further fueled by the fact that in those days two programs were used almost officially for compatibility testing - that same Flight Simulator and Lotus 1-2-3. It was said that they exploited the hardware features of the computer so thoroughly that if these two worked, everything else would work all the more.

I did have some doubts - I still knew about certain pitfalls in my design - but I decided to take the risk (especially since, of course, I wasn't risking anything). The result is on the screen:

By the way, the mysterious graininess of the picture at first aroused my suspicion - I immediately began to think of some very clever way of working with the video adapter that I did not support. In fact, as it turned out, this was Microsoft's way of obtaining additional colors: dithering, i.e. combining dots of the available colors. I must note that, given the 320x200 resolution, the result was dubious, to put it mildly.
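As a toy illustration (not Microsoft's actual algorithm - just the general idea of the trick), alternating two available colors in a checkerboard makes the eye perceive an intermediate color:

```python
# Toy dithering sketch: alternate two palette colors per pixel and
# average the result, roughly what the eye sees at a distance.
def checkerboard(color_a, color_b, width, height):
    """Build a 2D pixel grid alternating two RGB colors per pixel."""
    return [[color_a if (x + y) % 2 == 0 else color_b
             for x in range(width)]
            for y in range(height)]

def perceived_color(grid):
    """Average the grid's RGB values channel by channel."""
    pixels = [p for row in grid for p in row]
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))

# Mixing black (0,0,0) and white (255,255,255) at 320x200 gives mid-gray
grid = checkerboard((0, 0, 0), (255, 255, 255), 320, 200)
print(perceived_color(grid))  # (127, 127, 127)
```

At 320x200 the checkerboard cells are large enough to remain visible, which is exactly the graininess described above.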

There were no problems launching Lotus 1-2-3 either, so the experiment could be considered complete. Nevertheless, I carried out a number of small tweaks, after which all the programs I had at the time began to start and run absolutely normally. The only new feature I added after that was EMS. The thought that more than a megabyte of available memory was going to waste haunted me (to be honest, I just wanted to do something else), so I found a description of an EMS board with a driver and wrote a module that emulates the operation of this board. The driver successfully recognized the memory:

The final touch was the rework of the processor board itself. I didn't like the nightmare on the oscillograms at all, and I also wanted to practice with Eagle again. The result was a 4-layer printed circuit board, with one inner layer dedicated to ground and the other to both supply voltages. In addition, the most significant change was the elimination of ribbon cables - the connectors are placed so that my board plugs directly into the FPGA development board (to be absolutely precise, into the GPIO port expansion board of the FPGA development board - a matryoshka of sorts):

There were also some circuitry changes: the 8284 clock generator was removed entirely (I decided it could be moved inside the FPGA without any problems and without the slightest damage to bus-signal compatibility), as was the latch register on the address/data bus (also moved into the FPGA). A quick check of the waveforms on the new board showed that the signals had become almost perfect:

So, the path from a blinking LED on a solderless breadboard to a perfectly normal computer was completed in a couple of months, with a huge amount of fun and knowledge gained in a number of areas. The result is a computer with pretty good IBM PC compatibility, on which every program I bothered to obtain works without complaint, including those considered extremely demanding of hardware compatibility. The computer uses BIOS version 3 from the IBM PC almost in its entirety (except for the INT 13h handler).

It is almost impossible to say anything definite about the project budget. To begin with, what should be included - only a few chips (assuming the wiring is done with MGTF wire, and the FPGA board and debugging tools are already on hand), or everything, from the expedited manufacture of boards and the purchase of an FPGA development board specifically for this project to a far-from-cheap oscilloscope?

I have indicated the specific chip types and everything else in the article, so anyone interested can estimate what all this would cost in their own version. Naturally, it is not necessary to use exactly the DE2-115; as a reference point, here are the required FPGA resources:

It should be noted that a bunch of debugging artifacts still remain here, and the code itself has hardly been optimized.

What to do with all this (and whether to do anything at all), I am not entirely sure. The process made it clear once again that, although a lot can be achieved with enthusiasm and general erudition, formal knowledge of the basics would speed everything up, help avoid many a rake, and most importantly, let me concentrate on creativity instead of inventing a bicycle with square wheels. So for now I have a great desire to fill the gaps (or rather, gaping holes) in my knowledge of the basics of electronics and circuit design in general, and of VHDL in particular, by some express method. How this turns out, we shall see - as always, the problems are motivation and free time.

It should not be forgotten that although IBM PC-compatible computers are the most popular, occupying the lion's share of the market, there are also dynamically developing classes of computers that contain no x86 processors. In particular, computers not compatible with the IBM PC - laptops and personal digital assistants (PDAs) with processors developed by Motorola and IBM, and PlayStation game consoles - have a completely different internal architecture and are built on chips developed specifically for them. Outwardly, though, it is almost impossible to distinguish, say, a laptop with an Intel processor from a branded Apple laptop using a Motorola processor.

In addition, mention should be made of the PlayStation 3 game console, which appeared in large quantities in the fall of 2007. Its design uses the 9-core Cell processor developed by IBM. For its modest price and dimensions, its ability to create a virtual world on a monitor or TV screen is far beyond that of the most sophisticated personal computers with x86 processors.

Block diagram of the microprocessor

The block diagram of the basic model of the microprocessor is shown in Fig. 1.

Fig. 1. Block diagram of the microprocessor

The microprocessor can be conventionally divided into two parts: the Execution Unit (EU) and the Bus Interface Unit (BIU).

The execution unit contains an arithmetic block and registers. The arithmetic block includes an arithmetic logic unit (ALU), auxiliary registers for storing operands, and a flags register.

The eight registers of the MP execution unit (AX, BX, CX, DX, SP, BP, SI, DI), each one machine word long, are divided into two groups. The first group consists of the general-purpose registers AX, BX, CX and DX, each of which is a register pair composed of two registers half a machine word long.

The accumulator, or register AX, consists of registers AH and AL. The base register BX consists of BH and BL. The count register CX includes CH and CL. The data register DX contains DH and DL. Each of the short registers can be used alone or as part of a register pair. The conventional names (accumulator, base register, counter, data register) do not restrict the use of these registers - they merely reflect their most frequent use or the peculiarities of using a particular register in a particular instruction.
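The pairing can be illustrated with a small Python sketch (the helper names are mine):

```python
# Sketch of how a 16-bit register pair decomposes into its 8-bit halves,
# as described for AX = AH:AL above.
def split_reg(value16: int) -> tuple:
    """Return (high byte, low byte) of a 16-bit register value."""
    assert 0 <= value16 <= 0xFFFF
    return (value16 >> 8) & 0xFF, value16 & 0xFF

def join_reg(high: int, low: int) -> int:
    """Recombine AH:AL back into AX."""
    return ((high & 0xFF) << 8) | (low & 0xFF)

ah, al = split_reg(0x1234)
print(hex(ah), hex(al))        # 0x12 0x34
print(hex(join_reg(ah, al)))   # 0x1234
```

This is why an 8-bit operation on AL or AH is simply a read or write of one half of AX, with the other half untouched.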



The second group consists of the address registers SP, BP, SI and DI (in later models the number of address registers is increased). These registers are actively used for their intended purpose, and using them for anything else is not recommended. Their main purpose is to store numeric values used in forming operand addresses.

The bus interface unit, which connects the MP to the system bus, contains control registers, an instruction pipeline, an address ALU, a control unit for the MP execution unit, and a memory interface (linking the internal bus of the MP with the computer's system bus).

The BIU control registers CS (code segment pointer), DS (data segment pointer), SS (stack segment pointer), ES (extra segment pointer), etc. are used to determine the physical addresses of operands and instructions in main memory. The IP (Instruction Pointer) register holds the address of the instruction that will be fetched into the instruction pipeline next (in the Russian literature such a device is called an instruction counter). The MP's instruction pipeline stores several instructions, which allows, when executing linear code, the fetching of the next instruction to be overlapped with the execution of the current one.
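The address computation performed by the BIU can be sketched as follows (a Python illustration of the standard 8086/8088 segment:offset scheme):

```python
# The BIU forms a 20-bit physical address from a 16-bit segment register
# and a 16-bit offset: physical = segment * 16 + offset.
def physical_address(segment: int, offset: int) -> int:
    """Compute the 20-bit physical address for a segment:offset pair."""
    return ((segment << 4) + offset) & 0xFFFFF  # address wraps at 1 MB

# The next instruction is fetched from CS:IP; the reset vector, for example:
cs, ip = 0xF000, 0xFFF0
print(hex(physical_address(cs, ip)))  # 0xffff0
```

Note that many different segment:offset pairs map to the same physical address, which is a direct consequence of this formula.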

The flags register also belongs to the control registers of the MP; each of its bits has a strictly defined purpose. Usually the bits of the flags register are set by hardware during the execution of an operation, depending on the result obtained in the ALU. Properties of the result such as a zero value, a negative number, or overflow of the ALU bit grid are recorded in this way. But some bits of the flags register can be set by special instructions. Some bits have a purely service purpose (for example, storing a bit shifted out of the ALU), and some are reserved (i.e., not used).
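As an illustration, here is a Python sketch of how such flag bits can be derived from an 8-bit ADD result (the bit positions follow the 8086 FLAGS layout; the function itself is my illustration, not real hardware logic):

```python
# Sketch of hardware flag setting from an ALU result, as described above.
CF, ZF, SF, OF = 0, 6, 7, 11   # carry, zero, sign, overflow bit positions

def flags_after_add(a: int, b: int) -> int:
    """Return a FLAGS value reflecting an 8-bit ADD of a and b."""
    raw = a + b
    result = raw & 0xFF
    flags = 0
    if raw > 0xFF:          # carry out of bit 7
        flags |= 1 << CF
    if result == 0:         # zero result
        flags |= 1 << ZF
    if result & 0x80:       # negative result (sign bit set)
        flags |= 1 << SF
    # signed overflow: both operands share a sign that the result lacks
    if (a ^ result) & (b ^ result) & 0x80:
        flags |= 1 << OF
    return flags

f = flags_after_add(0xFF, 0x01)   # 0xFF + 1 = 0x00 with carry
print(bool(f & (1 << CF)), bool(f & (1 << ZF)))  # True True
```

The same result thus sets several flags at once, and conditional-jump instructions test exactly these bits.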

So why are modern desktop computers (with the exception of Apple computers) still called IBM-compatible, although the proportion of PCs made by IBM itself is quite small? The point is that at the dawn of personal computing only IBM declared the principle of "open architecture" for its computer. This meant that IBM, unlike all other manufacturers, did not intend to keep secret what exactly was inside its personal computers, and most importantly, it openly encouraged other firms both to produce components for its computers and to produce identical computers, which from that moment became known as IBM-compatible.

It was thanks to this policy that IBM-compatible computers firmly occupied the market, completely displacing the then-numerous competitors: many firms produced their own personal computers whose architecture was completely closed - Commodore, Olivetti ...

However, the paradox was that, having presented the world with IBM-compatible computers, the company itself quickly lost its leading position in their production. Using the completely open and well-documented architecture of these machines, various manufacturers began to release their own modifications, often much better than IBM's own models, and such well-known companies as Compaq, Hewlett-Packard, Acer and Dell entered the arena.

In addition, the openness of the architecture led to the emergence of so-called noname (nameless) computers, assembled like a construction kit from components made by completely different, not very well-known manufacturers. It should be noted, however, that almost all so-called brand-name computers are also assembled from components made by various companies; the job of the company putting its brand on the machine is to ensure the selection of these components and strict quality control.

There is, however, one completely separate type of personal computer that is not IBM-compatible: computers from Apple, which only Apple itself produces. Apple computers are often used as personal machines, but their main niche is publishing and printing.




IBM is a large corporation that today develops and supplies software and other high-tech products. Over its more than 100-year history, it has brought many new products to market. It was largely thanks to IBM that computers appeared in almost every home.

Start

IBM appeared at a time when the personal computer was hard to imagine. The company was founded in 1896 by Herman Hollerith under the name TMC (Tabulating Machine Company) and produced tabulating machines, which were sold mainly to government organizations.

At the beginning of its history, the company received a huge order from the census authorities, and thanks to this it immediately took a significant position in the market. However, due to health problems, the founder eventually had to sell the company to the famous financier Charles Flint. The millionaire paid what was then a whopping $2.3 million for the company.

The emergence of IBM

After taking control of TMC, Charles Flint immediately began merging it with other assets, such as ITRC and CSC. The result was the prototype of the modern "blue giant" - the CTR (Computing-Tabulating-Recording) corporation.

The newly formed company produced a wide variety of equipment for its time, including scales, time-recording systems and, most importantly, punched-card machinery. It was the latter that played the decisive role in the company's later transition to computers.

The IBM brand first appeared on the Canadian market in 1917 - this was how the company decided to show that it had become an international corporation. After the new name proved successful, the American parent also adopted the name IBM in 1924.

Over the next several years, the company actively continued to improve its technology, creating a new type of punched card called the IBM Card. The corporation also regained access to large government orders, which allowed it to avoid layoffs almost entirely, even during the Great Depression.

IBM and World War II

IBM actively did business with the Nazi regime in Germany; in 1933 the corporation even launched its own plant there. However, the company, like most other American firms, maintained that it was merely selling machines and did not consider this support for the regime.

In the United States during the war years, the corporation was mainly engaged in supplying the front under government orders, producing bombsights, rifles, engine parts and other items needed by the military. At the same time, the head of the corporation set a nominal profit of 1%, which went not to shareholders but to war relief funds.

The beginning of the era of computers

The first IBM computer, named "Mark I", was built in 1941-1943. The machine weighed an impressive 4.5 tons. After testing, its official launch took place only in 1944, following its transfer to Harvard University.

In essence, the Mark I was a greatly improved adding machine, but thanks to its automation and programmability it is considered one of the first computers.

The collaboration between the corporation and the machine's chief designer was extremely unsuccessful, and IBM computers continued to develop without him. As a result, in 1952 the company released its first vacuum-tube computer.

In the late 1950s, the first transistor-based IBM computers were created. It was this improvement that increased the reliability of the machines and made it possible to build the first missile-warning system on their basis. Around the same time came the first mass-produced IBM computer with a hard disk. True, the drive, shown to the Soviet leader in 1958, occupied two large cabinets and held all of 5 MB. The prices IBM set for it were also considerable: the first hard-drive prototype cost about $50,000 in the money of the time. But that was only the beginning.

First appearance of the IBM System

In 1964, new IBM computers were introduced. They changed the industry significantly and set the standard for many years to come. The family was named IBM System/360. These were the first machines that allowed computing power to be increased gradually by changing the model, without changing the software. It was in these mainframes that microcode technology was first introduced.

The computers created by IBM received a very successful architecture that became the de facto standard for many years. Even today the System z series, a logical continuation of the System/360 line, is actively used.

First PC

IBM did not see personal computers as a promising market. Nevertheless, in 1975 the first desktop computer of the IBM 5100 series was introduced. It was intended more for engineers and was not suitable for office work or personal use.

The "blue giant" presented its first mass-market personal computer only in 1981. In fact, the company did not have high hopes for its success, which is why most of its components were purchased from other companies. The new computer joined the IBM 5150 family and was named simply PC.

The popularity of the IBM PC

The new Intel processor required an operating system, and one was very successfully supplied by a young company founded by Bill Gates.

The biggest factor that made the PC popular was its open architecture. For the first time, the corporation abandoned its long-standing principles and did not lock up the components it used, or the BIOS, behind licenses. This allowed many third-party companies to quickly build "clones" based on the published specifications.

The open architecture provided other advantages as well, such as the ability to repair and upgrade computers yourself, which further spurred the development of personal computers.

However, IBM itself never really broke into the home computer market. The original IBM PC was quite expensive, and on top of the basic kit one had to purchase a floppy disk controller and the drives themselves. Against this background, the competitors looked more attractive.

Nevertheless, the company did try to launch a number of models for home users. One of them, the IBM PCjr, was even ranked among the 25 worst computing devices, and its production was quickly discontinued.

In the business segment, including the personal computer market, IBM traditionally felt right at home, thanks to high brand recognition and well-thought-out marketing. This success was consolidated by the IBM PC/XT and IBM PC/AT.

First laptop

Despite its initially dismissive attitude towards personal computers, the giant was forced to reconsider, above all because of the overwhelming success of the IBM PC. Incidentally, the six-month sales target for the first personal computer was met in less than 30 days.

The IBM Convertible went on sale in early 1986 and, despite its rather modest specifications, was produced until 1991. Among its innovations, this device was the first PC from the giant corporation equipped with a 3.5-inch floppy drive.

90s

By the 1990s the corporation was rapidly losing its position in the personal computer market, but for a long time it continued to produce new models of desktop and mobile computers.

First, in 1987, IBM brought to market a new computer, the PS/2, with a completely new architecture that was hardware-incompatible with previous generations.

The new computer received a modern data bus, and many components were changed in such a way that small Asian companies could not reproduce them, for technological and licensing reasons. But the architecture was a failure, even though some of its innovations lived on for a long time: PS/2 mouse and keyboard connectors, for example, are occasionally found even on modern machines.

At the same time, the company produced a series of computers compatible with the previous generation, called PS/1, and later Aptiva.

These were the last personal computers produced by the blue giant; by 1996-1997 the production of machines for this market segment had been phased out.

2000s and the final exit from the PC market

Despite ending the development and production of desktop PCs, IBM continued to produce, and quite successfully sell, laptops. Some users even continued to regard IBM laptops as the benchmark.

In 2004, the corporation made the difficult decision to sell its entire personal computer and laptop business to the Chinese company Lenovo. The company itself focused on the server and services market, far more interesting to the giant. IBM also sold off other divisions that tied it to PC production: the hard-drive division, for example, came under the control of Hitachi.

The long history of IBM has allowed the company to accumulate vast experience in the creation of computer hardware and software. Today, even despite the withdrawal from the PC market, the company has a fairly strong influence on the development of the entire industry.

The first step is to determine precisely the tasks for which your future system unit will be used. If you are planning a gaming build, pay special attention to the video card; for a graphics workstation, processor power and the amount of RAM play the fundamental role. The least demanding in terms of performance are office systems - you don't even need a discrete video card, because the integrated one will suffice. First you need to choose a processor. This element affects the overall performance of the entire system: the more cores (and the higher their frequency), the faster operations will be performed.

Next, the PC configurator will help you choose the motherboard. It must be compatible with the CPU and support the required RAM frequency. Pay attention to the presence of all necessary slots and connectors, as well as the form factor of the motherboard (ATX, micro-ATX, mini-ITX, etc.). Usually any motherboard already has a built-in network and sound card. The online store's website configurator will automatically offer compatible options after you select the processor, filtering out the unsuitable ones. A gaming computer must be equipped with a discrete video card; if you want to play modern games regularly and forget about upgrading your system for a long time, don't skimp here. The same applies to the amount of RAM: it will not particularly affect the cost of the PC, but it will significantly affect performance. The size of your hard disk determines how much information you can store on the computer at once, but to increase system performance it is recommended to additionally install an SSD to hold the OS, programs and applications.

For comfortable work with external media, the system unit can optionally be equipped with an optical drive and a card reader. One of the important elements of the system unit is the power supply. Its wattage should be selected after calculating the total power consumption of the components; in addition, leave a headroom of 100-200 watts for reliable operation under increased loads on the processor and video card. The configurator will not let you make a mistake with the power supply, since it takes into account the components you have chosen and offers only suitable cases with power supplies.
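The power-budget arithmetic described above can be sketched like this (the wattage figures are purely illustrative, not measured values for any specific parts):

```python
# Rough PSU sizing sketch: sum the components' draw, then add
# 100-200 W of headroom for peak loads, as advised above.
components_w = {
    "cpu": 125,
    "gpu": 220,
    "motherboard": 50,
    "ram": 10,
    "ssd": 5,
    "hdd": 10,
    "fans": 10,
}

total = sum(components_w.values())
recommended = total + 150  # mid-range headroom
print(total, recommended)  # 430 580
```

In this example a 580 W (in practice, a standard 600 W) unit would cover peak loads comfortably.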

A powerful gaming configuration calls for additional system cooling, which is selected automatically depending on the chosen processor. All that remains is to put everything into a case. It can be plain and simple if you plan to put the system unit under the table where no one will see it, or it can have neon lighting and a side window that lets you watch the system at work (the gaming options). This is a matter of taste, but keep in mind that a gaming PC case should be roomy and well ventilated so that the components do not overheat at peak loads.

Having difficulties?

For customers' convenience, the resulting configuration can be sent to print. And if you run into any difficulties, you can turn to our engineer, who will advise which components are best used to obtain the optimal specifications.
By deciding to have your computer assembled by us, you get the best prices and service. We guarantee fast and careful delivery of your system unit.
