Friday, August 14, 2015

A2 Serial Video to HDMI Converter - BBBlack

While the A2 Video Streamer boards were away at KansasFest I figured I would try and answer a question I had been given and had pondered over for a while. Is it possible to use the Raspberry Pi as a microcontroller instead of the FX2LP board and have it display HDMI? There are two main problems to overcome if using the RPi directly. The first is GPIO access speed. To sample the A2 video stream you need at least twice the video frequency, ie 28MHz. From looking around on the web this seems to be borderline achievable. The other problem is that the Linux operating system is not time deterministic, so there is a good probability that you are going to miss samples, especially at sampling speeds in the MHz range. This results in a jittery picture. There is always the option of running a real time operating system on the RPi, but is it going to have HDMI drivers for displaying the processed result?

The option more likely to work is sampling the data using an external device, say the FX2LP, and feeding the data in through the USB port. In a similar project the BitScope people chose to do it that way. That is the easiest way I can think of doing it. If using the direct method you would probably use a bus switch and a level shifter between the RPi and the external device anyway, so how much are you really losing by having an FX2LP do the job? Other groups have tried different options like the high speed camera port, but they have also come across limitations and have had to revert back to using the USB port.

So what is the answer? Yes, it can be done but possibly not as directly or easily as one would have hoped for. The other answer is not to use a RPi at all but instead go with a BeagleBone Black.

The BBBlack is a credit card sized single board computer that is similar in concept to the RPi. Where the RPi is a personal computer reduced in size, the BBBlack is more like a hotted up microcontroller. The BBBlack's video capabilities are not as good as the RPi's but it's the input/output capabilities that make it stand out. The CPU comes equipped with two high speed (200MHz) on-chip microcontrollers called PRUs. These have been used to sample data at up to 100MHz. If I was going to get an HDMI solution working, my money would be on the BBBlack.

That is exactly what I did. I was able to get an A2 serial video to HDMI converter working by using the real time capabilities of the PRU to sample data. I found that the CPU was not even needed, for a monochrome picture that is. The PRU can sample and dump the data directly into the frame buffer. The loop is tight, with only 14 clock cycles available between sampled pixels, but that's just enough to do monochrome video processing. For colour video the second PRU or the CPU would be needed. I suspect the CPU would definitely be needed if we wanted to render the output to look like one of those TV simulators.

Surprisingly I had more issues with the electronics on this project than with the programming. It was recommended that any signals going into the BBBlack be suppressed while it was booting, so a dual purpose device was needed: a bus switch for isolating the signals and a level shifter to convert the A2's 5V signals to the BBBlack's 3.3V inputs. At first I tried using a TXB0108 chip but the so-called Auto-Direction Sensing capability caused havoc. Creating random garbage on the composite monitor and causing the IIc to reboot was not a good sign. I switched over to the 74LVC245 chip, which is great for level shifting at low frequencies, but in the MHz range it passed the 5V straight through. I went back to using resistors for level shifting, like I had done on the A2 Video Streamer project, but left the 74LVC245 in there for signal isolation.

So is this the best way of making an A2 to HDMI video converter? I don't think so. Not unless you were going to build an all in one device where the BBBlack supplies multiple services such as video, serial, Ethernet and/or disk drive support to the A2. I have some more hardware on the way which hopefully can be used to build a simpler and even cheaper converter.

Digital to digital conversion is great. It gives you the option to view a perfect picture or to view it rendered and made to look like the TV or monitor display that you remember. It puts the control in your hands for scaling and positioning the picture on the screen. It should be cheap and relatively easy to build since there is no complex filtering or approximation to adjust and, better yet, it's not going to matter if you have an NTSC or PAL motherboard.

A quick video demonstration can be found here

Here are a few zoomed in photos comparing composite and HDMI. The photos don't do it justice. The HDMI picture looks much sharper on the screen than in the photos. It's also the case that the photos don't show how bad composite really is. In the photo you don't get to see the shimmer around the text next to the cursor and the fact that sometimes it can't sync correctly and displays random colours on a monochrome picture.

A cape was built to make it more user friendly. The application and source files can be found here

Monday, July 20, 2015

A2 Video Streamer - Update

I knocked up a few Veroboard adapters to make the connections look neater. Initially the Apple IIe version was wired up in the same way as the IIc version (but using the appropriate auxiliary bus pins), however I added a piggyback slot to the board after being annoyed with losing the 80 column / double high-resolution capability. Screen scaling was fixed and so was the double high-resolution page shift. I also did up some schematics and documentation so that Michael could take the boards with him to this year's KansasFest.

Thursday, May 21, 2015

A2 Video Streamer - Colour

Just before OzKFest 2015 I had a quick go at adding colour to the A2 Video Streamer. It was obvious that it needed some work. So began my investigation. I started with double high resolution graphics because once this was working all the other graphics modes would work as well. This is an exercise in TV/monitor emulation, and the TV/monitor does not know anything about the Apple II's video modes. It just processes the signals as it sees them. Get the lowest common denominator right and the rest will follow.

A. Monochrome.                            B. Colour block per four pixels.     C. Cross A with B.

D. AppleWin version.

The above pictures show my first attempts at colour and the AppleWin version to aim for. Actually, when using the IIc with a colour adapter and watching it on my 14-inch Sony monitor, I get a picture that is somewhere in between screen shots C and D. However for an LCD solution the AppleWin version should be adequate for now. The video signal clock is approximately 14MHz. It's different for PAL than for NTSC but the important thing is that the colour reference is always a quarter of the video signal clock. This gives us four pixels per colour block (0000 for Black, 0001 for Red, 0010 for Brown etc.) which equates to 16 colours. Now if we cut up the monochrome picture, screen shot A, into blocks of four pixels and apply the colours to each block we end up with screen shot B. If we do the same except display the colour only where the signal is white we end up with screen shot C. The double high resolution of 560x192 in monochrome mode is reduced to 140x192 when colour is introduced. The Apple IIc and IIe reference manuals call this the effective resolution but they just leave it at that. However when you look at screen shot B you can see that it's very blocky, and comparing that to the actual display, screen shot D, you see that there is more to it.

If we zoom into the bottom left corner of these screens we can see how different they are. I have drawn circles showing how the AppleWin version is made up. In the middle circle we can see that the green and yellow colour is filled in, as in screen shot B. The right circle shows how the blue is left thin, like in screen shot C. The left circle shows how some colours can be transformed into other colours, which is unlike B or C. This evaluation does not show a clear relationship between the colours so more investigation was needed. Generation of these displays is a common problem among computer emulator developers so where better to start than the emulator developer forums.

From the information I read I still did not get a clear picture of how I was going to work out the colour relationship. I wondered if there was a simpler way, one that didn't involve trawling through emulator code or working out the mathematical equations from first principles.

From Issue 89 of Compute! magazine, October 1987, I was able to get an article about Chrome, which provides "Double Hi-Res Graphics Commands For Applesoft", and subsequently the related disk image from ASIMOV. This allowed me to generate a few test patterns in AppleWin. The test pattern for two consecutive blocks is inadequate for explaining the colour relationship but the test pattern for three consecutive blocks is spot on. Bingo. Every block of four pixels can be determined by looking at the previous block and the next block. I was able to extract this information and place it into an array of 4096 elements. This is our lookup table. In the application I have optimised this lookup table for storage but I expand it to 4096 elements during the initialization routine. This allows simple and fast processing.

The above example is taken from screen shot B and shows how it is used to produce the colour shown by the leftmost circle in screen shot D. In step 1 we look up the colour blocks Black, Black and Magenta. This gives us 0000 (a hexadecimal number ranging from 0000 to FFFF, representing one of the 16 Apple II colours for each of the four pixels) which we then colour in as Black, Black, Black and Black. In step 2 we look up the value for Black, Magenta and White, which returns 000D, and we colour the next four pixels as Black, Black, Black and Light Blue. In step 3 we look up Magenta, White and Dark Green, which returns FFF7 and results in White, White, White and Yellow. The process goes on to complete the line and page. The result is a picture as follows.

Therefore with one extra lookup table and a bit more code we get colour output. I'm happy with the result and stoked with having achieved it with such a small amount of effort. Emulation of analog equipment such as monitors or televisions can get quite complex and requires a lot of computational power to pull off. Just check out the video rendering section of this Apple II emulator - OpenEmulator. I may have to revisit rendering when it comes to optimizing a full screen display but that's something to worry about for another day.

Sunday, April 19, 2015

A2 Video Streamer - Prototype


Whether it's the purchase of a computer without a monitor, or that last composite monitor biting the dust, or changing screens as a space saving exercise, every vintage computer enthusiast will have to deal with the display issue at some point in time. As screen technology has progressed, video adapters have been built to take up the slack from dwindling numbers of original display monitors. Trying to adapt the old computer technology to new devices and produce respectable results has been a challenge for many enthusiasts. As a monitor replacement solution I believe that it's the tablet market that holds the greatest potential going forward. Devices such as the iPad, with their well-proportioned screen sizes and high resolutions, are ripe for image manipulation and can produce fantastic results.

The A2 Video Streamer is an alternate way to display Apple IIc or IIe video. It's made up of hardware that samples and sends video signals over high speed USB and software which runs on a computer (including tablets) and interprets those signals to produce a perfect display, at the full frame rate, in an application window. All this can be achieved with low cost, readily available parts and a page full of control code.

The aim of this project was to obtain a portable display for my Apple IIc. I wanted to turn a luggable solution into a portable one. At first I was considering a hardware solution, like a composite panel, however I could not find any information on anything that looked reasonable and I figured that there was a good probability that, even after purchasing several panels, I still would not have been happy with the end result. This was way before seeing what Quinn and Javier had done with their systems. I considered other options too, like the original IIc flat panels, DVD players and USB streamers. Nothing provided the results that I was after. I had an iPad sitting on the table and thought that it would make a great monitor due to its size and high resolution. I thought that all I needed was an attachment but after looking into it more deeply I was shocked to find that it had no support for live video input.

Some further investigation turned up details on WiFi streaming and USB streaming. WiFi streaming looked like quite a bit of work to develop and I suspected that the lag would be quite bad. Considering that Apple IIs are now mainly used for games, lag time is critical, so I turned my attention to USB streaming. There weren't many solutions in this space but I did come across two examples that had been implemented so I knew that technically it could be achieved.

I wanted to see what signals I could get out of the IIc. I remembered how simple the circuit was on the IIc video port to component converter when I was looking into getting colour on a PAL Apple IIc. Compare the two schematics above. It looks so much simpler not having to deal with composite video. I set about to understand the IIc serial video signals.

Using an oscilloscope I checked out the signals and determined that they were ok for digital sampling. I was surprised to see the 14MHz (approx) clock being displayed as a sawtooth signal. The trigger point could have been an issue but at least the error would have been consistent and at worst be out by one clock cycle. It turned out to be just fine. On went the logic analyser and I tested the video to see if I could find the "HELLO" text as displayed on the monitor.

The sync signal was the first thing I checked. It turned out that the WNDW signal was simpler to deal with and so I didn't need the SYNC signal after all. Zooming in on the waveforms gave me a bunch of pulses at the top of the sync and then a bunch at the bottom. This corresponded with the text at the top of the page and the flashing cursor at the bottom. I zoomed in again on the first group and from the pattern I could instantly recognize it as the top row of the "HELLO" text. Some of the pulses looked thicker than they should have been but that was because the logic analyser's sampling frequency was not 14MHz or a multiple thereof. This will not be an issue once the Apple II video clock is used as a timing reference.

Once I knew that it was possible to get the video signals out of the IIc I had to find a way of displaying them on a screen at the other end. I needed a package that was portable and OpenGL fitted that need very nicely. I became familiar with several methods of displaying test patterns on the screen of my PC. There seems to be a lot of OpenGL setup code needed just to get a simple 560x192 picture on the screen, however it will come into its own later when scaling and rendering the image.

The next step was to get the data to the PC. Since the video out of the IIc streams at 14MHz, a microcontroller with high speed USB capability was needed. The Cypress EZ-USB FX2LP microcontroller (CY7C68013A) was the best choice, not only because it is an amazing bit of kit but because, compared to the other high speed USB capable microcontrollers, the FX2LP is well documented and there are plenty of examples from the manufacturer and from personal projects. Because it's so versatile and flexible it can be quite daunting at first. I obtained the FX2LP chip via a development board from eBay for a few bucks. It's basically just a breakout board for the microcontroller with an EEPROM added for firmware storage (I use the term firmware loosely here). There are several ways of making the firmware operate. The program can be burnt into the EEPROM, or another method can be used that makes this device awesome. By default the microcontroller comes with a very basic USB layer that allows an application to interrogate it and find out some simple details like the manufacturer name and device model number. This facility can be used to send it the firmware from the application. Imagine what this means for the developer. The firmware does not have to be set in stone. It can be downloaded to the FX2LP every time the application starts up. This also means the FX2LP can be a different device to different applications. Say a video streamer today and maybe an Apple II joystick to USB connector for your favorite PC/Mac emulator tomorrow. I still haven't decided if I'm going to use the FX2LP in this way.

The easiest way for two systems of different speeds to communicate with each other is via a FIFO (First In First Out) buffer. This is another great feature of the FX2LP. The microcontroller has a FIFO buffer in parallel with the CPU. Once the CPU finishes setting up the USB communications it is no longer needed. That's right, when sending data over USB there is no CPU processing involved. Running this way is called 'Slave FIFO' mode and this is the method I have used for this project. This mode can be used because we are not processing the data on the FX2LP. There are other modes such as 'Ports' mode, which is basically bit banging the Input/Output pins using the CPU, and 'GPIF - General Programmable Interface' mode, which is useful when communicating with more complex devices such as FPGAs, cameras, IDE drives etc where extra logic is needed.
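For a rough idea of what Slave FIFO setup looks like in the 8051 firmware, here is a sketch in Keil C. The register values follow the typical bulk-IN examples in the Cypress documentation but are illustrative only; the exact values depend on the wiring and must be checked against the FX2LP Technical Reference Manual.

```c
#include "fx2regs.h"   /* Cypress register definitions (Keil C51) */
#include <intrins.h>   /* _nop_() intrinsic */

/* Writes to some FX2LP registers need settling time; the Cypress
   examples define a SYNCDELAY macro of a few NOPs for this purpose. */
#define SYNCDELAY { _nop_(); _nop_(); _nop_(); _nop_(); }

/* Illustrative Slave FIFO setup for streaming samples in over EP2.
   Treat the values as placeholders to verify against the TRM. */
void slaveFifoInit(void)
{
    IFCONFIG = 0x03;     /* IFCFG[1:0] = 11: slave FIFO, external IFCLK */
    SYNCDELAY;
    FIFORESET = 0x80;    /* NAK all transfers while resetting */
    SYNCDELAY;
    FIFORESET = 0x02;    /* reset the EP2 FIFO */
    SYNCDELAY;
    FIFORESET = 0x00;    /* back to normal operation */
    SYNCDELAY;
    EP2CFG = 0xE0;       /* EP2: valid, IN, bulk, 512 bytes, quad buffered */
    SYNCDELAY;
    EP2FIFOCFG = 0x08;   /* AUTOIN: commit packets with no CPU involvement */
    SYNCDELAY;
}
```

With AUTOIN set, data clocked into the FIFO pins is packetized and sent up the bulk endpoint automatically, which is what lets the CPU sit idle once streaming starts.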

Firmware: Keil uVision2               Tools                               Software: Microsoft Visual C++ 2008

There are many configuration options when setting up USB on the FX2LP so let's just say that if all the rules are not followed to the letter then communication simply does not work.

Cypress provides tools for testing USB transfers but there is a lot of ground to cover before being able to use these tools with confidence. Most of the examples from the manufacturer use the Keil uVision programming environment but since the free versions are very limiting many people have converted over to open source environments. There were examples from several projects that I wanted to use but I didn't want to create multiple environments, so I stuck it out with the limited free version of Keil uVision and hacked away until I got the parts working that I needed. I experienced the same issue on the application side, ie with Visual C++ 2008 Express Edition. Several Cypress examples that interested me did not compile with the free version. However, I wanted to stick with this compiler because I had OpenGL working in it. Again code was hacked to get bits working.

There are two types of high speed USB transfers that can be used: BULK and ISOCHRONOUS. Currently I have only been able to get BULK transfers working. This means that the application will work so long as the USB bus does not get overloaded. Keyboards and mice are not going to add too much extra traffic but add a few drives, a printer and a web camera and the BULK method will start to throttle back the bandwidth available to our application. This produces some interesting artifacts like screen glitches. I plan to change to ISOCHRONOUS transfers at some point to guarantee a given amount of bandwidth at all times.

The FX2LP runs on 3.3V. Even though the inputs are 5V tolerant and it will work without level shifting resistors, it's probably safer to include them.

Sample format.

14MHz             -> Clock

WNDW    Bit[0]    -> Line and page sync
SEROUT  Bit[1]    -> Data bit
GR      Bit[2]    -> Determines when to process colour, ie mixed graphics modes
Spare   Bit[3]    -> Maybe for colour - CREF?
Spare   Bit[4]    -> Maybe for colour - TEXT?
Spare   Bit[5]    -> Maybe for colour - SEBG?
Spare   Bit[6]    -> Maybe for colour - LDPS?
Spare   Bit[7]    -> Spare

When the Window, OpenGL and USB code is taken out, the remaining code is quite short. The processing part is very simple.

    if (EndPt->FinishDataXfer(buffers[i], rLen, &inOvLap[i], contexts[i])) {
        BytesXferred += rLen;                   // count bytes actually transferred
        for (int iByte = 0; iByte < rLen; iByte++) {
            if (!bProcessData) {
                if (!(buffers[i][iByte] & 1)) { // WNDW bit: start of data window
                    if (iDisplayBlank > 8000) {
                        iDisplayY = 191;        // long blank: vertical sync, restart frame
                    } else {
                        if (iDisplayY > 0)
                            iDisplayY -= 1;     // short blank: next line
                    }
//                  iDisplayX = -7;             // 80 column setting.
                    iDisplayX = -13;            // 40 column setting.
                    iDisplayBlank = 0;
                    bProcessData = TRUE;
                } else {
                    iDisplayBlank += 1;
                }
            }
            if (bProcessData) {
                if (iDisplayX < 560) {
                    if (iDisplayX >= 0) {
                        if (!(buffers[i][iByte] & 2)) { // SEROUT bit: white or black pixel
                            color[iDisplayY][iDisplayX][0] = 255;
                            color[iDisplayY][iDisplayX][1] = 255;
                            color[iDisplayY][iDisplayX][2] = 255;
                        } else {
                            color[iDisplayY][iDisplayX][0] = 0;
                            color[iDisplayY][iDisplayX][1] = 0;
                            color[iDisplayY][iDisplayX][2] = 0;
                        }
                    }
                    iDisplayX += 1;
                } else {
                    bProcessData = FALSE;       // end of line reached
                }
            }
        }
    }

Here are some results.

High Resolution Graphics and Double High Resolution Graphics

Notice how the double high resolution picture is shifted to the left. This also occurs with 80 column text. At first I thought this was an issue with how I was processing the data but on further inspection I found the same thing was happening on my IIc monitor. Because there is a substantial black border around the picture it's not noticeable. It's only when you apply some tape to the left hand margin and change from 40 columns to 80 columns that you can see the missing 80 column character.

Monochrome text can be any colour including white, green or amber.

Colour display is still a work in progress.

Normal                                        50% Scan Lines
Will try and match something like AppleWin output.

Presentation was given at OzKFest 2015. Thanks Andrew for putting it up on SlideShare. You can find the link here
A quick video demonstration can be found here

Extra information :-

WNDW signal is good for synchronisation because it changes state just before the data begins, but it cannot be used to determine the end of a line because it finishes before the data ends.

LDPS signal can be used to determine if 80 Columns/HGR or 40 Columns/GR is being used. Pulse is half the size for 80 Columns / HGR.

To determine which graphics mode is selected these signal lines are needed. Will this be needed for colour processing? I don't know yet.

Mode    TEXT    Mixed GR/TEXT    Mixed HGR/TEXT    HGR    DHGR
GR      0       5/0              5/0               5      5
TEXT    5       0/5              0/5               0      5*
SEBG    pulse   5/5              0/0               0      0

* TEXT signal is high in DHGR mode.

Only a few minor tasks still need doing. I'll have them all done by next week. Only kidding.
  • Complete colour video processing.
  • Get ISOCHRONOUS transfers working.
  • Align 80 column text / DHGR with the other modes.
  • Build an adapter for the IIc video expansion port and one for the IIe expansion slot.
  • Scale up the size of picture and apply rendering to reduce pixelation.
  • Apply effects such as scan lines to resemble an original monitor.
  • For the USB library convert to using LIBUSB instead of CyUSB for greater portability. 
  • Implement an application on the Mac platform.
  • Implement applications on tablet platforms.
  • Options editor ie type of screen to emulate, RGB colour palette editor. 
Using SEROUT and accompanying signals is not difficult. I encourage you to give it a go. I’ve used this method to build a better streamer but I would like to see what others can do.

VGA cards like the Guimauve 2000 and Nishida Radio’s VGA options already process serial data. They use FPGAs to do the processing but what about using other devices such as a Raspberry Pi or a PSoC or a dedicated VGA chip?

I thought that video streaming to the iPad would take one to two days to set up. It has taken quite a bit longer and I’m only half way there.

It's still very much a work in progress, however as a reference for a working demo, here are the application and source files