My design criteria were to use off-the-shelf ICs and to stay as close as possible to the original 1980s technology: a few DIP ICs mixed with a few SMT parts to trim the PCB area down as small as possible, thereby cutting PCB fabrication costs.
The original design was a basic Z80 SBC with 32KB of EEPROM (AT28C256) mapped in at 0x0000 to 0x00FF (for boot code) and 0xE000 to 0xFFFF (utilities, monitor ROM and BASIC), with 56KB of RAM (2 @ CY62256 or the like) mapped from 0x0100 to 0xDFFF. I was planning to put a simple CP/M-compatible bootloader in the first 256 bytes of ROM, which would be able to boot CP/M from an SD card. The SD card interface would be emulated with an AVR. Since I have a few on hand, I was planning to use an 82C51A UART and an 82C55 as a simple parallel I/O device. To simplify and reduce the “glue-logic” gate count, I would dust off my CUPL skills from a decade or so ago and make use of some LATTICE GAL20V8Z SPLDs I still had from when I worked in the industry as a disti-FAE. This initial design was designated V1.00.
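For reference, the V1.00 memory map can be sketched as a quick C check. This is my own helper for illustration, not the actual CUPL equations that would go into the SPLD:

```c
#include <stdint.h>

/* V1.00 memory map sketch (illustrative decode, not the real SPLD logic):
 *   ROM: 0x0000-0x00FF (boot code) and 0xE000-0xFFFF (monitor/BASIC)
 *   RAM: 0x0100-0xDFFF                                               */
typedef enum { REGION_ROM, REGION_RAM } region_t;

static region_t decode(uint16_t addr)
{
    if (addr <= 0x00FF || addr >= 0xE000)
        return REGION_ROM;      /* EEPROM selected */
    return REGION_RAM;          /* everything in between is RAM */
}
```

In hardware, the same two comparisons collapse into a handful of product terms on the GAL20V8Z, driving the ROM and RAM chip-select pins.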
As I got to thinking more about how I wanted to merge the project with an AVR, I realized it would be much easier and more efficient, and would provide some luxuries, to use the AVR as a system controller with access to the Z80’s Address, Data and Control lines. I could then do away with the ROM and pre-load the RAM with a bootloader, monitor, BASIC, etc. at power-up or on a user-initiated RESET. Using one of the AVR’s 8-bit timers, I could use the OCxA (or OCxB) output to directly drive the Z80’s clock and peripherals through a buffer. The maximum speed would be the AVR’s system clock divided by two. I would have control of the Z80 RESET line as well. Hence the V1.10 (z80_computer_1v10 PDF) design was born.
The AVR of my choice is the ATMEL AT90USB1286, with its built-in hardware USB controller and many I/O ports, most of them 8 bits wide, which works well for the byte-wide address and data buses. With the AT90USB1286, I could talk to the system controller over a simple virtual COM port, which is directly supported by the Linux and Windows OSes. That would also save me from having to use a serial-to-USB adapter (FTDI, CP2102 or the like) with the 82C51A, as I could feed the 82C51A’s TX and RX pins to the AVR’s TXD and RXD pins and use the AVR as a serial pass-through device. I could use one of the AVR’s 16-bit timers to generate the baud rate clock for the 82C51A as well. Using Timer 1 of the AT90USB1286, I could divide the AVR’s clock down to as low as 122Hz. With 16-bit resolution, I can effectively support any desired “industry standard” baud rate. At 115.2K, 57.6K, 38.4K, 19.2K and 9600 baud, the calculated error is 0.08%, which is well within the acceptable tolerance. That’s pretty darn good! With the AVR as a pass-through, I could monitor the serial input stream for a certain control character to signal it to take control of the serial I/O and let me talk to the AVR system controller for “house-keeping” functions and utilities, like single-stepping, code upload and download, setting the Z80 clock speed, etc.
A low-cost AT90USB1286 development board would be nice to work with, and I already have a few of PJRC’s TEENSY 2.0++ boards, which come in a 40+ pin quasi-DIP “package”, so that’s the plan. On the TEENSY 2.0++ board, port F of the AT90USB1286 is fully accessible, which gives me access to the JTAG interface for hardware debugging of my AVR code when I get to that part of the design.
Since I was planning to use an AVR to emulate an SD card as a “disk drive”, it seemed that I might as well also try to emulate the 82C51A or, as on the RC2014 project, an MC6850 UART; then I could do away with another large 28-pin 600 mil DIP IC and the baud rate generator circuit. If I emulate the MC6850 UART, I can use some of the RC2014 code unmodified, as that project makes use of the MC6850. If I wanted to, I could emulate the Z80-SIO device as well, which would make more of the Z80 code out there on the ’net usable with little modification. Among my goals was to keep the PCB size small to save on fabrication costs. Thus, using the AVR as the UART, the SD memory interface and the Z80 clock source seemed to be the correct path to follow, as it eliminates at least three existing ICs and some passive components as well.
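As a sketch of what the MC6850 emulation would look like from the Z80’s side, here is a minimal two-register model in C. The port addresses and buffer size are my assumptions for illustration; the status bits (bit 0 = RDRF, receive data ready; bit 1 = TDRE, transmit register empty) follow the 6850 datasheet:

```c
#include <stdint.h>

/* Hypothetical I/O port assignments for the emulated ACIA. */
#define ACIA_STATUS 0x80
#define ACIA_DATA   0x81

/* Small receive FIFO fed by the AVR's real UART/USB side. */
static uint8_t  rx_buf[64];
static unsigned rx_head, rx_tail;

static void acia_rx_push(uint8_t c) { rx_buf[rx_head++ & 63] = c; }

/* Called when the AVR sees the Z80 read one of the ACIA ports. */
static uint8_t acia_read(uint8_t port)
{
    if (port == ACIA_STATUS) {
        uint8_t s = 0x02;                    /* TDRE: always ready to send */
        if (rx_head != rx_tail) s |= 0x01;   /* RDRF: a byte is waiting    */
        return s;
    }
    return rx_buf[rx_tail++ & 63];           /* ACIA_DATA: pop one byte    */
}
```

On the real hardware the “DISK” and “UART” chip selects would trigger this logic, with the AVR driving the data bus during the Z80’s I/O read cycle.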
Although I currently see no need for them, in the V1.10 design I have left the ADC0 through ADC3 pins (PF0 to PF3) available in case I decide I need to monitor an analog voltage for some reason. The AVR’s TXD and RXD, INT0 and INT1 pins (PD0 to PD3) are also left unconnected thus far. I wanted to keep INT0 and INT1 available in case I need to wake the AVR from SLEEP mode, or simply to respond to the “UART” and “DISK” chip selects for the UART and disk emulation.
As for the Z80 system clock, the OC2A output of the AVR provides the Z80 clock. Timer 2 is an 8-bit timer. If the code is set up such that the AVR clock is the clock source to the timer and “mode 2” (CTC mode) is selected, the OC2A pin can be programmed to toggle on an OCR2A match. Thus, the OC2A output toggles at a rate dependent on the AVR clock source. If OCR2A is set to “0”, the result is a divide-by-2 clock for the Z80. The TEENSY 2.0++ clock is 16MHz, which means the fastest my Z80 could run would be 8MHz, while setting OCR2A to “1” would drop that to 4MHz. The output frequency is AVR_CLK / ((OCR2A + 1) * 2). The minimum Z80 clock would be 16MHz / ((255 + 1) * 2), or 31.25KHz. As one might imagine, the Z80 clock could even be set to something “odd”, say 2.666MHz, 1.6MHz, 1.333MHz, etc. The Z80 clock could be changed on-the-fly, and even used to suspend the Z80 in a “standby mode” by forcing the Z80 clock pin low. I plan to implement “single-stepping”, which is why running the Z80 at a low frequency of sub-100KHz would be useful, as it would give the AVR time to monitor and toggle the appropriate Z80 control pins.
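The clock formula above boils down to one line of C, handy for tabulating the available Z80 speeds. The AVR-side register setup is shown in a comment; it follows the AT90USB1286 datasheet (CTC mode via WGM21, toggle OC2A via COM2A0, no prescaling via CS20), plus setting the OC2A pin as an output:

```c
#include <stdint.h>

#define F_CPU 16000000UL  /* Teensy 2.0++ system clock */

/* Timer 2, CTC mode, OC2A toggling on compare match:
 *   f_Z80 = F_CPU / ((OCR2A + 1) * 2)
 *
 * On the AVR itself the setup would look roughly like:
 *   TCCR2A = (1 << COM2A0) | (1 << WGM21);  // toggle OC2A, CTC mode
 *   TCCR2B = (1 << CS20);                   // timer clocked at F_CPU
 *   OCR2A  = 0;                             // 8 MHz out
 */
static uint32_t z80_clock_hz(uint8_t ocr2a)
{
    return F_CPU / (((uint32_t)ocr2a + 1) * 2);
}
```

Sweeping OCR2A from 0 to 255 gives the full menu of speeds from 8MHz down to 31.25KHz; for single-stepping, the AVR would instead stop the timer and drive the clock pin directly.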
As stated earlier, I intend to use the AVR to “download” a monitor or CP/M bootloader into the RAM starting at address 0x0000. Since the AVR controls the Z80 RESET pin, the Z80 can be held in a RESET state until the memory load is complete, and the RESET pin then released so the Z80 can begin executing code from 0x0000.
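The power-up sequence can be modeled on the host like this. Here `bus_write()` stands in for the AVR driving the Z80’s address and data lines (plus /MREQ and /WR) while the Z80 is held in RESET; the actual pin wiggling is hardware-specific and the function names are mine:

```c
#include <stdint.h>
#include <stddef.h>

static uint8_t ram[65536];   /* stand-in for the SRAM on the Z80 bus */
static int     z80_in_reset; /* stand-in for the Z80 /RESET pin      */

/* On real hardware: put the address on the bus, drive the data lines,
 * then pulse /MREQ and /WR to write one byte into the SRAM. */
static void bus_write(uint16_t addr, uint8_t data) { ram[addr] = data; }

static void preload(const uint8_t *image, size_t len)
{
    z80_in_reset = 1;                      /* assert Z80 RESET           */
    for (size_t i = 0; i < len; i++)
        bus_write((uint16_t)i, image[i]);  /* load boot code from 0x0000 */
    z80_in_reset = 0;                      /* release; Z80 runs at 0x0000 */
}
```

Before releasing RESET, the AVR would also tri-state its bus drivers so the Z80 has the address and data buses to itself.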
Next up, getting the SPLD to function as an I/O decoder.