Wednesday, August 30, 2023

A2VGA - Pico Cards - Colour Models - OpenEmulator Like Display




Are the photos above taken of an LCD or CRT monitor screen? Hard to tell? That's the challenge. The challenge of making the LCD display resemble that experience we had growing up with CRT technology. Is it authentic enough? Everyone is going to have a different opinion. I find it impressive, even though it's a work in progress and there is always room for improvement.

There has been a substantial amount of work done in recent times in relation to Apple II video hardware. There were two video hardware based presentations at this year's KFest event ("VidHD-style HDMI solution with Ethernet-out bus card to Pi" by John Flanagan and "Yet Another Video Converter with Tang Nano 9K FPGA" by Rob Kim). The "AppleII-VGA" and "∀2 Analog" cards have been improved upon and used at the KFest solder session. The one that sparked my interest the most was ushicow's "Tang Nano 9K Apple II Card". You may have noticed his post in the KFest Discord environment. In the meantime, I've been making progress with my own solutions, the A2VGA IIe and IIc cards. This time it's been on the display quality side of things.

I've been playing around with Apple II colour models for several years and I've tried to understand how the colours relate to each other. When I say colour model, I mean converting the 560x192 monochrome video display (which at its core is what the Apple II really is) into the most visually pleasing coloured display. I started off by using digital models. By this I mean each resultant pixel is one of the sixteen possible Apple II colours. A sixteen colour palette is used to look up the colour to be displayed. The AppleWin - Color (Composite Idealized) model is one of these and was the first model I implemented into my projects, back in 2015. Recently I've been trying to get my head around analog models. This is what I call a model where, instead of picking a colour from a palette, mathematics is used to calculate the required colour. The full range of system colours can be used (A2VGA uses a 16bit / RGB565 colour system).
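For the curious, the RGB565 system mentioned above packs a colour into 16 bits. Here is a minimal sketch of the packing (an illustrative helper, not actual A2VGA code):

```python
def to_rgb565(r, g, b):
    """Pack 8-bit-per-channel RGB into a 16-bit RGB565 word.

    The top 5 bits of red, 6 of green and 5 of blue are kept;
    green gets the extra bit as the eye is most sensitive to it.
    """
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(to_rgb565(255, 255, 255)))  # 0xffff (white)
print(hex(to_rgb565(255, 0, 0)))      # 0xf800 (pure red)
```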



In this blog entry I'm not going to go into detail as to how composite video works. What is worth noting, however, is that colour on the Apple II revolves around a four pixel block. The reason for this has to do with the pixel clock frequency. When the Apple II requires colour to be output, it sends out a colorburst signal of approximately 3.58MHz. Colour is generated as a phase difference from this signal. If the Apple II's pixel clock was the same frequency as the colorburst then the "ON" part of that signal would output the entire colour spectrum in one bit. This is just a monochrome signal. If the Apple II's pixel clock was 3 times the colorburst frequency, then the colour spectrum would be divided into three pixels. This is basically red, green and blue. However, the Apple II's pixel clock is four times the colorburst, meaning that four pixels cover the colour spectrum. Each pixel is treated as being ninety degrees away from its neighbour. The "ON" or bright parts of the monochrome signal determine which parts of the colour spectrum are added together to form the resultant colour.
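As a toy illustration of that four-pixel relationship, the sketch below treats each bit as a subcarrier sample 90 degrees after its neighbour: the average of the four bits gives the luma, and correlating the pattern against sine and cosine gives the chroma. This is a simplification of my own (no filtering, no YIQ-to-RGB step), not code from any of the models discussed here.

```python
import math

def decode_window(bits, phase0=0.0):
    """Toy decode of one 4-pixel artifact-colour window.

    Luma is the average brightness of the four bits; the chroma
    components come from correlating the bit pattern with the
    colorburst's cosine and sine, sampled at 90-degree steps.
    """
    luma = sum(bits) / 4.0
    i = sum(b * math.cos(phase0 + k * math.pi / 2) for k, b in enumerate(bits)) / 2.0
    q = sum(b * math.sin(phase0 + k * math.pi / 2) for k, b in enumerate(bits)) / 2.0
    return luma, i, q

# All four bits on is solid white: full luma, and the chroma terms
# cancel, which is why white stays white instead of picking up a hue.
y, i, q = decode_window([1, 1, 1, 1])
```

A single bit on (e.g. `[1, 0, 0, 0]`) gives one quarter luma plus a non-zero chroma vector whose angle, and therefore hue, depends on which of the four positions the bit occupies.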

Colour spread can be adjusted by an aperture knob on a CRT monitor / TV. Turning the knob one way spreads the colour out, giving you nice uniform colour, but the picture looks smooth and fuzzy. What this is doing is covering up the "OFF" or dark pixels (gaps of up to three pixels) that come from the monochrome picture. Turning the knob the other way shortens the colour spread and results in a sharper looking picture. However, if you go too far you will start seeing these dark pixel gaps appear, and they cause the picture to look stripy. This same effect can be achieved by using different models or changing the parameters that generate those models. You'll see this contrast in the models below.

The next part of this blog will show a comparison of these various models as best I can. Please note that these are photos taken of the display screen, so it's not going to be the same as viewing these results on a monitor with one's own eyes. I've used a lot of zoomed in images to give a feel for the differences between the models. Some of these zoomed in images might look a bit trippy, but that's not necessarily how they look when viewed as part of a whole screen.

I used the game Vindicator to capture High Resolution Graphics (HGR) and Airheart for Double High Resolution Graphics (DHGR). If you have read any of my previous blogs, then you may have noticed that I use Airheart a lot for testing. For one, it runs in DHGR mode and DHGR is less forgiving than other Apple II modes. Problems tend to stand out more. I do use other Apple II video modes, but only after I'm happy with the DHGR results first. Secondly, Airheart allows me to test all the points below in one go.

1. Display of the single width pixel and how colour spreads into the dark pixels. Look at the blue lines at the bottom of the screen. I like them being as thin as possible.
2. Uniformity of solid objects. Look at the red, green and blue squares at the bottom of the screen and the screen borders. This varies from a smooth even look to a sharp stripy look.
3. Moving white objects. The compass text as it spins about should only change the faint glow around the font and have minimal effect on the font colour.
4. Double pixel width fill ins. Look at the scoreboard for digit contrast, fill in colour and the colon between digits.
5. Clarity of DHGR text.
6. End tips of the blue water wave lines can show off the oval Gaussian pixel spread typical of analog type signals (hence my preference for VGA over DVI/HDMI when available).


Digital models:



Every colour model takes its source from the monochrome data. The simplest model takes four pixels, determines the colour from the palette of sixteen and then outputs that colour as four pixels. That model is called the RGB model (displayed by some RGB expansion cards). It's very blocky and does not look anything like an Apple II output. This model helps us see that setting the correct pixel location is also important.



Generating a digital model can be broken down into two stages. The first stage is colour mapping the pixel locations where the monochrome pixels were "ON" or bright. The above diagram shows four different ways in which the source data can be read. Each way results in a slightly different output. The second stage involves filling in the gaps between the coloured pixels. Only gaps of one, two and three pixels need to be filled in. Again, there are multiple ways in which one can choose to fill in the colours.
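Those two stages can be sketched as follows. The read scheme and fill rule here are just one simplified choice each, and the `palette_index` function is a hypothetical stand-in for a real colour lookup:

```python
def colour_map(mono, palette_index):
    """Stage 1: give every ON pixel a palette index derived from the
    4-bit group it falls in (one of several possible read schemes).
    OFF pixels stay None for now."""
    out = [None] * len(mono)
    for i, bit in enumerate(mono):
        if bit:
            group = i // 4 * 4
            nibble = 0
            for k in range(4):
                if group + k < len(mono) and mono[group + k]:
                    nibble |= 1 << k
            out[i] = palette_index(nibble)
    return out

def fill_gaps(pixels, max_gap=3):
    """Stage 2: fill runs of up to three OFF pixels with the colour of
    the nearest ON neighbour to the left (a simple fill choice)."""
    out = list(pixels)
    run = 0
    for i in range(len(out)):
        if out[i] is None:
            run += 1
        else:
            if 0 < run <= max_gap:
                left = i - run - 1
                fill = out[left] if left >= 0 else out[i]
                for j in range(i - run, i):
                    out[j] = fill
            run = 0
    return out
```

With `mono = [1, 0, 0, 1]` and an identity lookup, stage one maps both ON pixels to colour 9 (0b1001) and stage two fills the two-pixel gap, giving a solid run of colour 9.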

The result is a model such as AppleWin - Color (Composite Idealized). This model has a very thin fringe of colour around white objects. I played around with trying to increase the fringing colour, but that just made things worse. It makes it look like the colour is part of the font instead of a faint glow around the font. This model is very sharp, but that comes at a price. Sometimes it creates anomalies, like on the number 8 in the HGR example above, or less-than-ideal sharp colours between the letters.


Analog models:



Kris helped me understand how his ii-pix model works. I converted the code to run on the A2VGA cards. Playing around with the model parameters, I got great results.

I then tried converting over the OpenEmulator code and had success there as well. The Pi Pico is limited in resources and only runs up to 133MHz-ish, unlike other video solutions that run in the GHz range and contain many more cores, so there are obviously limitations with this model, ie no barrel or shadow mask effect, limited integer scaling and 16bit colours. Even still, the results are outstanding.



Here is a good comparison of DHGR text. On the left we have the sharp AppleWin Color (Composite Idealized) and on the right we have the smooth but authentic OpenEmulator with a luma bandwidth set at 2.0M. If you find that the OpenEmulator picture is too smooth, then you may have become accustomed to thinking that CRT displays were sharper and looked more like today's emulators. You can adjust the OpenEmulator output to be sharper by upping the luma bandwidth, but at a value of 2.5M you will start to see the picture become stripy, and by 3.0M the striping is well defined. I've played around with the Chebyshev and Lanczos windows which OpenEmulator uses in its mathematical calculations to see how the output changes, and I've also added a tweak to clean up solid lines.
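To get a feel for what the luma bandwidth setting does, here is a crude stand-in: a box filter whose width scales inversely with the bandwidth. OpenEmulator actually uses windowed-sinc filters, so this only shows the trend, not the real output:

```python
def lowpass_luma(bits, bandwidth_mhz, sample_rate_mhz=14.318):
    """Box-filter approximation of a luma low-pass: a lower
    bandwidth means a wider averaging window and a softer picture."""
    taps = max(1, round(sample_rate_mhz / (2 * bandwidth_mhz)))
    half = taps // 2
    out = []
    for n in range(len(bits)):
        window = bits[max(0, n - half): n + half + 1]
        out.append(sum(window) / len(window))
    return out

# A lone bright pixel spreads across more neighbours at 2.0M than at
# 3.0M, which is why the lower setting looks smoother on screen.
smooth = lowpass_luma([0, 0, 0, 1, 0, 0, 0], 2.0)
sharp = lowpass_luma([0, 0, 0, 1, 0, 0, 0], 3.0)
```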

A link to a video example of the OpenEmulator model: https://drive.google.com/file/d/1MkWuSuV-nNTKfjn8OiW6IyQbo_N3P9YP

During a discussion, a question was asked about how an analog model would look on hardware that had a lower colour range, specifically 9bit / RGB333. I ran a simulation and it came out pretty well. Much better than I had expected. However, it did struggle with colours that were similar, ie in Airheart the noticeable colours were red, orange and brown.
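Simulating the lower colour range is just a matter of dropping the low bits of each channel (an illustrative helper, not the simulation code itself):

```python
def rgb565_to_rgb333(c):
    """Truncate an RGB565 colour to 3 bits per channel to preview
    how it would look on a 9-bit RGB333 display."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r >> 2) << 6) | ((g >> 3) << 3) | (b >> 2)

# With only 8 levels per channel, nearby colours such as red,
# orange and brown can truncate to the same RGB333 value.
print(rgb565_to_rgb333(0xFFFF))  # 511 (white maps to all nine bits set)
```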


Using high definition (WUXGA):



The above chart shows how much difference there is in colour spread between a 1:1 pixel translation and a 1:3 pixel translation. However, since the OpenEmulator's output model is very smooth, using high definition does not improve the output on a standard size monitor. Where high definition may become advantageous is when trying to smooth out a digital model, or where each pixel becomes large and blocky, ie when viewing the image on a large screen. It's something I have not needed as much as I thought I would, but it's there now in case I need it.

I'm not a fan of the 1920x1080 / 1080p / Full HD resolution. It doesn't allow for nice integer scaling of the Apple II display. Therefore, I haven't spent much time getting this resolution to work.


Conclusion:

All my equipment is PAL based and the best composite output I have is from a IIc Modulator/Adapter. Hence, I'm not in the best position to be comparing results to actual hardware and fine tuning the results. If you are interested in Apple II colour models, below are a few good links to resource material.

OpenEmulator
https://observablehq.com/@zellyn/apple-ii-ntsc-emulation-openemulator-explainer
https://zellyn.github.io/apple2shader/
https://github.com/gabrieldiego/tg/blob/master/ref/Keith%20Jack%20-%20Video%20Demystified%20-%20A%20Handbook%20for%20the%20Digital%20Engineer%204e.pdf

ii-pix
https://www.youtube.com/watch?v=JpyGDqhcFlA
https://github.com/KrisKennaway/ii-pix

Stephen A. Edwards method
https://www.cs.columbia.edu/~sedwards/papers/edwards2009retrocomputing.pdf

AppleWin - Color (Composite Idealized)
https://lukazi.blogspot.com/2017/03/double-high-resolution-graphics-dhgr.html
https://lukazi.blogspot.com/2015/05/a2-video-streamer-colour.html

Other
https://nerdlypleasures.blogspot.com/2021/10/apple-ii-composite-artifact-color-ntsc.html
https://www.kansasfest.org/wp-content/uploads/2009-ferdinand-video.pdf
https://www.kreativekorp.com/miscpages/a2info/munafo.shtml
https://docs.google.com/spreadsheets/d/1rKR6A_bVniSCtIP_rrv8QLWJdj4h6jEU1jJj0AebWwg/edit#gid=0

Thanks goes to Kris for his help on the ii-pix conversion, Marc for implementing OpenEmulator and Zellyn for making the OpenEmulator code more accessible.

Who knows where this will go? I've been playing around with my own digital and analog models. To date, they have not been as impressive as the models detailed in this blog. To me the A2VGA is more than just an Apple II video card. It's a system that allows me to play around with various Apple II colour models. Now, all I need is more play time. I plan on covering how the A2VGA works and how all this was implemented in a session at the next OzKFest event in late October 2023.

Monday, July 17, 2023

A2VGA - Pico Cards - High Definition


A2VGA - Pico IIe/IIc cards are now capable of displaying HD/WUXGA resolutions ie 1920x1080 60Hz and 1920x1200 60Hz. All this while amazingly still running the Pi Pico at the video pixel clock frequency. The importance of this will be explained shortly.

What's the big deal with a HD (high definition) resolution, I hear you ask? Well, it has nothing to do with modern monitors or digital video. It's all to do with having a more detailed picture / a more realistic look. By using a VGA resolution (640x480), the Apple II display of 560x384 (lines doubled to give a squarish look) fits in nicely to give you one LCD pixel for every Apple II bit/pixel. However, by using HD you get three LCD pixels for every Apple II bit/pixel. Combined with the complementary 16bit (RGB565) colour range, I'm hoping this is going to make a big improvement to the picture display.
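The pixel budget works out like this (a quick illustrative calculation, not A2VGA code):

```python
def integer_scale(apple_w, apple_h, screen_w, screen_h):
    """Largest integer scale of the (line-doubled) 560x384 Apple II
    display that fits a given screen, plus the leftover border."""
    s = min(screen_w // apple_w, screen_h // apple_h)
    return s, screen_w - apple_w * s, screen_h - apple_h * s

print(integer_scale(560, 384, 640, 480))    # (1, 80, 96)   VGA: 1 LCD pixel per bit
print(integer_scale(560, 384, 1920, 1200))  # (3, 240, 48)  WUXGA: 3 LCD pixels per bit
```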

While waiting to finalise the hardware for the next revision of the A2VGA cards I worked on the firmware issues that I knew needed addressing. I also wondered how far the Pi Pico board could be pushed. I didn't know if it could do what I wanted it to do but I took the gamble anyway and it paid off. The Pi Pico has impressed me with its performance.

The first firmware revision used the Raspberry Pi Foundation's video driver. The Pi Pico does not have enough memory for a full VGA display buffer (using RGB565): that would require 600kB, but the Pi Pico only has 264kB of RAM. Therefore, a line buffer is used and each line is constructed on the go. This driver was not fast enough to process just one line at a time; 128 pre-calculated lines were required just to make VGA work. It's a great driver and ideal for certain applications, but because it is so generic it is only just capable of handling the Apple II video application at the VGA resolution.

I went looking for other VGA drivers to use. There are plenty out there and they all look terrific at first, but when you get down to the details they all have their limitations (only works in QVGA resolution, limited colour range, needs to be encoded into the data stream, needs to be overclocked by over 100% etc). This is great because they are all customised to their specific applications, but for the Apple II video application they are unwanted roadblocks.

What caught my interest was the driver written by Hunter Adams. It was simple and fully contained within a single PIO block. A Pi Pico PIO block only holds 32 instructions and this VGA driver used up 31 of them. An amazing feat to get it all in there. Even though the PIO was basically all used up, I still went ahead and converted over to this new driver. It was a great improvement for the A2VGA: the data processing lead time dropped from 128 lines to under a single line. This gave me hope that a HD resolution was within reach. Actually, I knew HD was possible; however, all the examples I came across required the Pi Pico to be overclocked from its maximum speed of 133MHz to over double that, ie around 300MHz. That might be fine for a demo, but it is not going to work for a reliable product.
Going from a VGA 60Hz pixel clock (25.175MHz), where the Pi Pico runs at roughly 5 times the pixel clock ie 125MHz, to HD 60Hz (148.5MHz) / WUXGA 60Hz (154.0MHz in reduced blanking mode) is a big jump, and practically it would only work if the Pi Pico could send out a pixel every clock cycle. That is a tall ask. But with the PIO fully loaded, where was the extra processing to come from? The breakthrough came from first optimising the VGA driver to buggery and then having enough room to implement outputting a pixel every clock cycle. The system then became scalable: up to reach HD resolutions, or down to better match VGA frequencies for older monitors. Yes, it still requires overclocking, but only just over 10%. Speculation is that the 133MHz limit is based on the flash speed, but the A2VGA runs its code from RAM instead, so I'm not concerned with the small amount of overclocking needed. The problem now becomes processing the data in time and transferring it between buffers. By using both CPU cores the data processing becomes borderline capable. It is still a work in progress, since I need to find some more room for little things like extra video modes.
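As a sanity check on those numbers, assuming one pixel per system clock against the stock 133MHz limit (illustrative arithmetic only):

```python
def overclock_percent(pixel_clock_mhz, stock_mhz=133.0):
    """Overclock needed for the system clock to match a given pixel
    clock, emitting one pixel per clock cycle."""
    return max(0.0, (pixel_clock_mhz / stock_mhz - 1.0) * 100.0)

print(round(overclock_percent(148.5), 1))  # 11.7 -> HD 60Hz
print(round(overclock_percent(154.0), 1))  # 15.8 -> WUXGA 60Hz reduced blanking
```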

I don't have any demos to show off because currently the display is just a scaled image from the VGA driver, which can be seen in my previous blog post. I have contacted a few people about getting some assistance with generating a finer colour model worthy of display. I'll include an example then. In the meantime I will be displaying the results at the upcoming Brisbane Apple II gathering (this weekend - 22/07/2023).

Saturday, May 6, 2023

A2VGA - Pico IIe Card







The A2VGA - Pico IIe is an Apple IIe specific VGA card for the budget conscious user. This card, like many of the Apple II RGB video cards, uses the raw video signals from the Apple IIe auxiliary slot. It's constructed from easy to obtain, through hole and relatively inexpensive components. These include the Pi Pico microcontroller, a level shifter, a diode, several resistors, sockets and some optional buttons. This is a product of my previous video investigations. Since I've had bugger all hobby time in quite a while, I've only managed to get this knocked out because it's been a small task. I guess you can say this is picking the low hanging fruit from the possible video cards I could have tackled.

Since the card uses the same signals which are available on the Apple IIc video port, the A2VGA - Pico IIc will be available at a later stage. This will contain the same firmware but will have a different form factor plus a 12V to 5V step down converter.

The card is super simple to build, super simple to use and super simple to update the firmware on. No special tools are needed for updating the firmware. Connect a USB cable from your PC to the Pi Pico and when a new drive pops up, you just copy to it the latest/desired firmware. Done. That's it.

The card can be used by itself if the Apple II only needs to run with 64kB. Extra memory needs to be plugged into the piggyback slot to run more memory hungry software and to use the DHGR video mode. The AIIE 80COL/64K MEMORY EXPANSION card works well as extra memory; however, taller memory cards may have issues fitting under the lid. Once the memory card is inserted into the piggyback slot it is difficult to remove. I use a small screwdriver to gently lever the memory card out using the provided PCB holes.

I tested the card's display lag (see a previous post of mine if you are interested in how this is achieved) and found that on the Samsung SyncMaster 740N monitor it was one frame behind the CRT monitor, while on the DELL U2410 monitor it was two frames. I believe most of this delay comes from the monitor itself, since the A2VGA does not have a video frame buffer as such. An internal A2 monochrome buffer is updated after every 32 bits read from the A2 bitstream. Independently, the VGA generation part just processes and spits out a line at a time, as fast as it can.

The card works by using the following video signals:

The required signals:
- 14M : Apple II video clock.
- SEROUT* : Luma part of the video signal ie the monochrome pixel bitstream.
- WNDW* : Start of line being displayed or SYNC* - Start of line at the left border. I provided a logic analyser screen shot showing the difference between WNDW and SYNC in my last video post.

Optional signals that are nice to have.
- GR : Sets up each line to display in colour or monochrome.
- LDPS : Not implemented on this project. Each Apple II display line is 560 pixels long (it does not matter what video mode is being used by the Apple II); however, the starting location on the screen changes based on the video mode. For example, the 80 column text mode starts at a different location on the screen compared to 40 column text mode (four pixels of difference). LDPS can be used to align these video modes. Using a typical monitor, you get a black border around the displayed picture, so most people will never notice this effect. I used this signal in my video streamer project because the display was in a 560 pixel window, meaning that without this signal I would have had to either chop off one of the modes by 4 pixels or add an unwanted vertical bar to the left or right side of the picture depending on which mode was selected.

How it works:
The input PIO section of the Pi Pico records the accessory bits (GR, LDPS if needed) and then 560 screen pixels into a 32bit aligned structure for each line. The monochrome A2 buffer needed is just over 14kB. This is a fraction of the 264kB of Pi Pico RAM available. The VGA generation code runs independently. Each VGA line is pre-calculated based on a specific A2 colour model (AppleWin's Composite Idealized) before being sent to the output PIO section. This section drives the VGA resistor DACs (Digital to Analog Converters) and Sync signals. The A2 display area of 560x192 is displayed within the standard 640x480 resolution. The lines are doubled to produce a squarish look. When in scanline mode, every second line is blacked out, making it appear as if the monitor contains scanlines. The colour model is a two-stage transformation. First, a twelve-bit scrolling window is used, where the middle four bits are translated into four coloured pixels (each coloured pixel being 4 bits, ie the 16 Apple II colours). Then the second transformation converts each of these coloured pixels into the closest matched Apple II display colour using 16 bits (RGB565).
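The two-stage transformation can be sketched like this. The window rule and palette here are placeholders to show the data flow only, not the real A2VGA tables:

```python
def render_line(mono, window_to_colours, palette):
    """Slide a 12-bit window (4 bits of context either side of the
    middle 4) along a monochrome line, turning each middle nibble
    into four 4-bit Apple II colours, then into display colours via
    a 16-entry palette."""
    padded = [0] * 4 + list(mono) + [0] * 4
    out = []
    for i in range(0, len(mono), 4):
        window = 0
        for k in range(12):
            window = (window << 1) | padded[i + k]
        for colour in window_to_colours(window):  # four 4-bit colours
            out.append(palette[colour])
    return out

# Placeholder rules: the middle nibble is used as the colour directly,
# and the "palette" is an identity table rather than real RGB565 values.
win2col = lambda w: [(w >> 4) & 0xF] * 4
print(render_line([1] * 8, win2col, list(range(16))))  # eight pixels of colour 15
```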

I've implemented the best looking colour model that I have at the moment. I've also turned on the scanline effect. I have not used any of the buttons yet. The board can output RGB565; however, I got the project working using RGB555 since that was what the example project was configured for. I'm sure it is not going to make much difference, so I'm not in a rush to fix it. I've still to test several options, like the resistor values used for the DACs. I've used precise resistor values, ie 500Ω, 1k, 2k, 4k, 8k and 16k; however, resistors do not normally come in these values. They are obtainable, but I would like to see how much of a difference it would make to the display if standard resistor values were used instead.
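For anyone curious about the resistor DACs, the output level into the monitor's 75Ω termination can be estimated with Millman's theorem. This assumes a binary-weighted ladder with a 500Ω MSB resistor and 3.3V drive; the values and the 5-bit channel are illustrative, not a statement of the exact A2VGA circuit:

```python
def dac_voltage(bits, resistors, vdd=3.3, load=75.0):
    """Millman's theorem for a binary-weighted resistor DAC: each set
    bit sources VDD through its resistor while the 75-ohm VGA
    termination pulls the summing node toward ground."""
    num = sum(vdd / r for bit, r in zip(bits, resistors) if bit)
    den = sum(1.0 / r for r in resistors) + 1.0 / load
    return num / den

# One 5-bit channel, halving resistor values from the 500R MSB.
rs = [500.0, 1000.0, 2000.0, 4000.0, 8000.0]
full_scale = dac_voltage([1, 1, 1, 1, 1], rs)
print(round(full_scale, 3))  # 0.743, close to VGA's nominal 0.7V full scale
```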

I’m looking at making some changes to the next PCB revision. These changes include:
1. Add an option for the VGA socket to be PCB mounted or externally connected via a cable.
2. Fix ribbon cable pinouts. Currently they are very close together making it horrible to wire on the cable.
3. Use 74LVC245 instead of the 4 line MOSFET level shifter. This will allow more signals to be added. I want to add Sync in case someone wants to rewrite my firmware using a different method. If you have any other signals you wish to add (which are available on both the Apple IIe aux slot and the Apple IIc video port) then let me know. I'll consider adding them as well.

Concerns
1. The NTSC Apple IIe aux port is in a different location to a PAL aux port. I don't have a NTSC IIe motherboard to test on. I'm estimating clearances by going off pictures of motherboards.
2. Electrical interference. VGA is analog so you may experience noise from electrical interference, especially if a ribbon cable is going to be used for routing the VGA plug to the rear of the case. I can see slight interference effects on the display when looking for them. This is on one of my monitors while the other monitor looks totally fine. I get a different effect when using the IIe vs using the IIc. For example, I tried to locate a small shimmer that I am seeing when using the Apple IIc. I have tracked this down to being caused by the IIc power supply on the ground line. Not sure what I can do to mitigate this. More testing is needed.

The VGA generation pretty much uses an entire core and one of the PIO modules. The Raspberry Pi SDK VGA generation code could be replaced by an alternative, much simpler model (there are a few of these around) if more processing power is required. The second PIO is partially used up with importing the A2 video stream. Currently, the second core is only being used by one interrupt which occurs at the end of each line (handles the A2 horizontal and vertical sync processing).

The Pi Pico runs much slower than the Raspberry Pi Zero, so I found that the colour model needed more optimisation to make it work. I'm happy that I was able to get it running without having to overclock the processor. To obtain better colour models (where higher screen resolutions and larger colour ranges, ie RGB888, are required) I suspect something more powerful than the Pi Pico is needed. If efficiency was of no concern and money no object, then we could throw a 3GHz processor at the problem and have something running that looked like OpenEmulator. However, I believe that there is a middle ground where FPGAs or Pi Zero like devices could be used to generate great colour models.



I've been testing on the IIc for the most part because it is much easier to access the card.

Project files will be provided once the final card design is worked out, a schematic generated and the code cleaned up.

Here is a video of the card's output: https://docs.google.com/open?id=1ZbibOvvSFw8lRugwCkTYdhJfkaTdboKr


Update: 6th June, 2023.


Display of 80 Column Text. Note: scan line emulation is turned on.


Colour Killer testing, Mixed Mode (monochrome and colour) testing - VIDD7 mask, Power regulator testing.

Wednesday, May 3, 2023

Apple IIGS to RGB Adapter


On my hobby desk I usually keep my 9inch Sony monitor sitting around as it does not take up a lot of room. It supports composite as well as RGB so I can quickly connect up any of my Apple II variants. For the IIGS I've been using a bogey custom cable that I knocked up a long time ago. It's always been a pain for the times when I need the cable to be just a fraction longer. So it's been on my to do list for a while to put together an adapter so that I can use any of my various length component RGB cables. I recently put in to get a bunch of PCB boards made up and took the opportunity to include a PCB for a IIGS video to RGB adapter. I've allowed for a spot to attach an extra RCA socket in case I ever have a monitor which requires an external Sync signal.



Friday, March 24, 2023

Native Two Joystick Gameport Support


Did you know that the Apple II computer contains native multiplayer non-keyboard support? Over the years I've documented this functionality being done with digital joysticks, but since I've also been involved with native two-joystick game enhancement projects, I figured it was a good time to document these as well.

All Apple II models contain a 16 pin or a 9 pin plug providing support for a single joystick. This means all you need is a paddle set like the following to play multiplayer paddle games such as Warlord or Star Thief. The first paddle uses the x-axis part of the joystick and the second paddle uses the y-axis part.



Multiplayer joystick games such as One-on-One, Archon I & II, DogFight and Superstar Ice Hockey (IIGS only) can also be played natively on the Apple II; however, an adapter is required to make this happen. This is because the Apple II only contains a single physical game port (16 pin or 9 pin), so the second joystick's signals need to be routed out to a new second joystick port. Commercial adapters such as the Wico joystick adapter or the Paddle-Adapple perform this function, and I've seen modern incarnations of these adapters being sold. Alternatively, you could build one of these yourself since the connections are very simple. It's probably one of the simplest enhancement projects you could make for the Apple II. This will work on all Apple II models except the IIc / IIc+. On these models the second joystick signals were replaced with digital mouse signals.



The wiring convention was to have only one trigger button per joystick (PB0 for Joystick 1 and PB1 for Joystick 2). For my converter, I've added a switch allowing me to swap between single player (PB0 and PB1 map to Joystick 1) and dual player (PB0 and PB2 map to Joystick 1, PB1 and PB2 map to Joystick 2). The extra button, PB2, allows for a common function, say starting or pausing a game.

Just recently Michael contacted me and asked for assistance with the conversion of the game Mario Bros to dual analog joystick support. I thought this was a great idea and, because I've already done a few patches to this game, it did not take long to perform the conversion. I'm wary of the fact that dual analog joystick code, in computing terms, can take a long time to process. I suspect this is the reason why not many games support this functionality. However, Mario Bros only requires one axis (left/right movement) per player, so this should not be any more taxing on computing resources than any other Apple II joystick-based game. It plays very well and is a lot of fun in co-op mode.

The other project that I was lucky to have been involved with (helping with testing) was Nick's conversion of the game Robotron: 2084. The original game is single player but uses two joysticks, one for movement and one for shooting. The story is that its creator, Eugene Jarvis, who also created Defender, had a broken arm at the time so couldn't use buttons. Nick couldn't understand why the author of the Apple II version hadn't added this mode into the game. So he tracked him down and quizzed him on various parts of the code. Nick disassembled the game, added the mod and added the author's name (Steve Hays) to the credits. He really deserved that. What I love more than playing this game on the Apple II is playing it in AppleWin using two thumbsticks on a Logitech controller. Nick said that adding the thumbstick support to AppleWin was a bit of work, but worth it for arguably the greatest classic shoot-em-up. I tend to agree. It's a fast paced game with lots of things going on at once. You're always just a fraction away from being annihilated. That is how it is meant to be played. It's edge of your seat type stuff and a spectacle to watch when played well.

Here is the stand I set up at the 2017 OzKFest gathering for showing off the conversion.



Attached is a zip file containing the converted games, conversion documentation and schematics for the joystick adapter (both Michael's version and mine): https://docs.google.com/open?id=1xwcqHOAhXEmHleHvQRsnNdBCJY9tKehv
Here is also a link to Michael's YouTube video presentation where he discusses the joystick adapter: https://youtu.be/N24VPiIjzCA

Happy gaming.


Update: 10th October, 2023.


For the soldering session at OzKFest 2023 I created this simple board.

Saturday, February 26, 2022

Apple II video to HDMI using a Raspberry Pi Zero / Pico



Example output from an Apple II video to HDMI adapter that I have been toying with. Pictured is one of the many colour models. The dual colour mode represents how the Apple II's video is inherently monochrome and yet the use of artifact colour results in a decent colour depth for such a small storage footprint.

In September 2021, Rob, a fellow Apple II enthusiast, posted some images of an RGBtoHDMI, a Raspberry Pi based HDMI video solution, set up on his Apple II. It sparked my interest because back in 2015 I looked into using the Raspberry Pi as a HDMI video display solution. This was around the time I was working on the "A2 Video Streamer" project. I couldn't see how the Raspberry Pi could capture signals consistently and fast enough to process the A2 video stream, so I implemented a solution using the BeagleBone Black instead. Even though my devices worked, neither of them were user friendly enough to warrant further development. I don't see these projects as failures but as learning opportunities and stepping stones to bigger and better solutions. The RGBtoHDMI has given me new ideas to investigate.

The RGBtoHDMI is an amazing project (https://github.com/hoglet67/RGBtoHDMI/wiki). It began as an HDMI interface for the BBC Micro but, because of its flexibility, it has been modified to support many computer systems of the same era, including the Apple II and the Apple IIGS. It's made up of two main parts: a CPLD handles the level shifting, digital or analog signal sampling and the bit shifting, while the Raspberry Pi Zero handles the pixel clock and frame image generation. It generates the right frequencies to synchronise to a computer's video clock. However, this comes at a price: it pretty much maxes out the RPi Zero's CPU. Even though it can output Apple II video to HDMI, using this device for the Apple II, in my opinion, is not overly efficient (I'm talking about the TTL signals here and not the composite one). The Apple II provides its own pixel clock, so the majority of RGBtoHDMI's awesomeness is not needed in this situation. Instead, I would prefer to use the RPi Zero's CPU resources to perform better video emulation. The source code for RGBtoHDMI is huge and complex. I wanted to take a more simplistic approach.
        


Raspberry Pi Zero being used for video output and a few of the many failed attempts at getting a stable picture.

I wanted to revisit the Raspberry Pi to see if it was capable of being a simple modern-day Apple II video solution, especially now with the tiny size of some Pi models. I started off by using the RPi Zero to sample Apple IIc video signals directly (after level shifting, of course). To get the maximum performance out of the RPi Zero, bare metal programming was used. Not true bare metal: the manufacturer's firmware is still being used to set up the communications and the video, but everything after that is bare metal. Even then, using the general GPIO interface, I wasn't able to obtain a stable picture. I tried writing the program in C, in assembler (manually optimised), using DMA and even running the code on the Video Processing Unit (VPU), but with no luck. The Apple II pixel clock being just above 14MHz means that sampling needs to be performed at around the 30 mega samples per second mark. I was getting into the high 20s but the readings were not consistent. We have a situation where the Apple II is outputting the video stream at a consistent rate but the RPi Zero only wants to read the data when it feels like it. The RPi Zero is not a microcontroller. It's a computer, and its GPIO is isolated from the CPU, which makes it difficult to control high speed parallel data coming in. In most situations the RPi Zero will be the master, dictating when and how the data is transferred. We don't have that luxury here, so a new plan was needed.
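As a back-of-the-envelope check (my own working, using only the well-known NTSC-derived Apple II clock; PAL machines differ slightly but the conclusion is the same), the "around 30 mega samples per second" figure falls out of the pixel clock like this:

```python
# Estimate the GPIO sample rate needed to read SEROUT without a
# hardware clock input. Assumption: NTSC-derived Apple II timing.

NTSC_COLORBURST_HZ = 315e6 / 88          # ~3.5795 MHz colourburst
PIXEL_CLOCK_HZ = 4 * NTSC_COLORBURST_HZ  # the 14M signal, ~14.318 MHz

# Without the 14M clock to latch against, you want roughly 2x
# oversampling to catch every pixel reliably.
required_msps = 2 * PIXEL_CLOCK_HZ / 1e6

print(f"pixel clock  : {PIXEL_CLOCK_HZ / 1e6:.3f} MHz")
print(f"2x oversample: {required_msps:.1f} Msps")
```

Which lands at about 28.6 Msps, just under the 30 Msps mark mentioned above, and explains why "high 20s" with jitter wasn't quite enough.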

Digging deeper into the RPi Zero, there is an interface which shows potential as a high speed parallel means of reading in data: the RPi's Secondary Memory Interface (SMI). There isn't a lot of documentation on the interface or example projects, especially when it comes to bare metal. However, I did find enough to get me started and plenty to keep me experimenting. The worst part of this interface is that the bus is driven when it's not active. This is to protect memory chips from having floating pins, which is not good for them. However, we are not interfacing with a memory chip but with an Apple II video source, and the consistent Apple II video stream doesn't fit in well with this behaviour. I needed to find something to isolate the two. At first I was concerned about the SMI shorting out the Apple II motherboard, but I found a great solution: the 74HCT4066 worked quite well as an isolator. This is an analog switch whose small amount of "on resistance" allows it to be used as a level shifter, and its control lines were used to isolate the Apple II video signal whenever the SMI interface drove the data bus. I did manage to pull in data at 15 mega samples per second but for some reason it would not work at higher speeds. At that point, my wired connections between the IIc and the devices were a mess and not as short and organised as shown in the photo above, so I wasn't surprised I had issues. Frustrated with not getting anywhere with the project, I took a break. Time was running out for the year and I wasn't going to lug my development machine with me on holidays, so any more work in this area was going to have to wait until the following year.



Raspberry Pi Pico and a HDMI breakout board being used for video output. Monochrome output.

What I did have with me was my laptop and the Raspberry Pi Pico, which I wanted to try out. I was able to quickly and easily set up a development environment for the RPi Pico which allowed me to play around with video examples over the holidays. When I returned home, it only took a weekend to get the RPi Pico working as a video output device for the Apple IIc. The RPi Pico is a totally different beast compared to the RPi Zero. It's a microcontroller and is able to control the GPIO pins at the same rate as its system clock, which is rated at 133MHz. This A2 video solution is based on top of the PicoDVI project (https://github.com/Wren6991/picodvi). It uses an HDMI socket but the signals are actually DVI signals, bit-banged out of the RPi Pico in a semi-compliant format. It's amazing that this actually works, since DVI and HDMI are normally reserved for devices that run at ten times the speed of the RPi Pico or have dedicated hardware. This solution has several drawbacks. The RPi Pico needs to be overclocked to 252MHz for a 640x480 60Hz picture, and even more if you want higher resolutions. Also, because it is not fully DVI compliant, whether it works on a given monitor is pot luck. I found that a monochrome picture worked well but I could not get any colour examples working properly on the holiday apartment's TV or my home DELL monitor (even after trying the recommended workarounds). I'm sure I could have found a monitor to make this work but I just didn't see this as a solution we could move forward with. That's not to say the RPi Pico is not useful. Using the RPi Pico as an Apple II to VGA adapter would be a better alternative, and there are several projects around which demonstrate the RPi Pico's ability to generate VGA signals. The RPi Pico's VGA output would need to be RGB565 (16bit colour) since there are not enough pins on the Pico to directly drive an RGB888 (24bit colour) solution. Maybe you could, but only with more hardware.
However, a common Apple II colour palette (https://docs.google.com/spreadsheets/d/1rKR6A_bVniSCtIP_rrv8QLWJdj4h6jEU1jJj0AebWwg/edit#gid=1819819314) fits perfectly into the RGB565 colours.
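To sketch why a 16-entry palette survives the squeeze into RGB565 (the helper below is my own illustration, not code from any of these projects): packing simply keeps the top 5, 6 and 5 bits of each 8-bit channel, so a small, well-spaced palette loses almost nothing.

```python
def rgb888_to_rgb565(r: int, g: int, b: int) -> int:
    """Pack an 8-bit-per-channel colour into a 16-bit RGB565 word,
    keeping the top 5/6/5 bits of red, green and blue respectively."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

# Two palette entries as examples (illustrative values only --
# use the linked spreadsheet for a real Apple II palette).
white = rgb888_to_rgb565(0xFF, 0xFF, 0xFF)  # packs to 0xFFFF
black = rgb888_to_rgb565(0x00, 0x00, 0x00)  # packs to 0x0000
print(hex(white), hex(black))
```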

Taking it to the next level will require RGB888 (24bit colour / 16 million colours) and a lot of processing power for colour emulation. That's why I moved back to working with the RPi Zero. Again I tried my luck with the SMI interface and yet again I fell flat on my face. Getting SMI working with DMA and bringing in the data at a paced rate would be the ultimate goal. This would leave the RPi Zero totally free to do just one job, i.e. colour emulation. To get a working solution in the meantime, I had to compromise. I figured I could get something operational now and then later work backwards to achieve the ultimate result.

Logic analyser output showing one line of an Apple II video signal. Things to note:
1. The clock (14M) is too fine to make out at this scale. Only after zooming in would you see a nice square wave.
2. The start of line can be determined by using the SYNC or WNDW signals.
3. The GR signal goes from high to low, indicating that the previous line was a graphics line but the current line is part of the text section. The Apple II screen is in MIXED mode, i.e. graphics and text.

Before getting to the working solution I just want to express how simple the Apple II video stream really is. There are three signals which give you most of what is needed for a video display. The most important is the serial data out (SEROUT), which carries the monochrome 560 pixels per line of the Apple II screen. All the Apple II video display modes are a subset of the 560 pixels. Then there is the clock (14M). Its frequency is just over 14 MHz. It doesn't matter that the frequency of the US version is slightly different from the PAL version, because the only thing needed for processing is the clock's edge. The third signal is used to work out the horizontal and vertical line synchronisation. The sync (SYNC) signal can be used for this if you want to start at the very left of the screen, including blank padding, or the window (WNDW) signal, which starts just before the 560 Apple II pixels. The vertical sync can be determined by reading the specific sync pattern or just by measuring the time between sync pulses. Other, less important signals can be used to make the display cleaner. For example, the GR signal can be used to kill the colour for lines that are displaying text (this removes the colour and makes text look clear and sharp). The LDPS signal can be used to re-align different video modes: the number of line pixels is always 560, but some video modes start closer than others to the left hand border.
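To make that concrete, here's a simplified software model of the capture using the signal names above. The sampling representation is my own invention for illustration (one sample per 14M rising edge); a real capture also has to handle SYNC for vertical timing:

```python
def extract_line(samples):
    """Given one line of per-14M-edge samples as (wndw, serout) pairs,
    return the 560 monochrome pixel bits of that line.

    WNDW gates the active region (high just before the 560 Apple II
    pixels); SEROUT carries the pixel bits themselves. A simplified
    model only -- SYNC handling is omitted."""
    pixels = [serout for wndw, serout in samples if wndw]
    return pixels[:560]

# Example: 100 samples of blanking (WNDW low), then 560 active pixels
# alternating on/off.
blanking = [(0, 0)] * 100
active = [(1, i % 2) for i in range(560)]
line = extract_line(blanking + active)
print(len(line))  # 560
```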


Raspberry Pi Pico and Pi Zero being used for video output. 


Examples of different monochrome and colour modes. The colour mode shown here is what the AppleWin emulator calls "Composite Idealized". The last mode fades the picture in and out, but clearly the current setup doesn't have enough processing power to do this smoothly.

A working video solution was built using an RPi Zero and an RPi Pico together. The RPi Pico acts as a serial to parallel converter and a buffer which stores data until the RPi Zero is ready to read it. This means the RPi Zero's GPIO does not have to operate at high speed. I would have preferred to use a hardware serial to parallel FIFO but I just couldn't find anything that was cheap and available in a DIP package. In its current configuration the RPi Pico is not even working up a sweat. Only one PIO module is operating, which means the other PIO module and the two CPU cores are not being used. An 8 bit bus was enough to get this device to work. I considered going to 16 bits but then I would lose the serial port debugging pins. Also, the archaic RPi Zero GPIO pin numbering makes routing PCB tracks more difficult, so fewer bus lines in this case may be better. The GPIO is being read directly and, for now, is not using the SMI interface. Eight pixel bits are read and processed on the fly before being sent to the frame buffer. Since the bitstream always represents a monochrome picture, emulation is needed to convert this into colour information. I'm experimenting with several different colour models and pushing the limits to see how much processing power can be used before the video picture turns to shit. At the moment the data from the RPi Pico is being polled, which is a very inefficient way of doing communications. Even before going to the ultimate DMA solution, I could rewrite the code to be interrupt or state driven to get more out of the system. There are several other simple patches that could increase performance, like adding a bigger buffer to the RPi Pico or adding buffering to the RPi Zero (this would allow for processing during the blanking times). However, then you start to trade off video delay against emulation quality. For now I'm just happy that it works.
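To give a feel for what a "digital" colour model does with the monochrome bitstream, here is the crudest possible sketch: decode each aligned group of four pixel bits as a 4-bit index into a sixteen-colour palette. This is my own simplification for illustration, not the AppleWin "Composite Idealized" algorithm, which does a smarter job around colour boundaries:

```python
def colourise_line(bits):
    """Crudest digital colour model: each aligned group of four
    monochrome bits becomes a 4-bit colour index (0-15), painted
    across those four output pixels. Real models use a sliding
    window and phase handling for cleaner edges."""
    bits = bits + [0] * ((4 - len(bits) % 4) % 4)  # pad to a multiple of 4
    indices = []
    for g in range(0, len(bits), 4):
        nibble = (bits[g] | (bits[g + 1] << 1) |
                  (bits[g + 2] << 2) | (bits[g + 3] << 3))
        indices.extend([nibble] * 4)
    return indices

# A fully-lit line decodes entirely to index 15 (white).
line = colourise_line([1] * 560)
print(len(line), set(line))  # 560 {15}
```

The resulting indices would then be looked up in a sixteen-entry palette (such as the one linked earlier) to get the final RGB values.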

The adapter can be developed into a package for the IIc using the IIc's unique video port, for the IIe using the 60-pin auxiliary port, or for the older Apple II models using the fiddly method (that's the technical word for it) of alligator clips connected directly to chip pin legs (like in a2heaven's "Apple II VGA Scaler" card). A lot of the code is currently hard coded since the aim was to concentrate on getting a working system. It's currently operating in the centre of a 640x480 frame running at 60Hz and using ARGB8888. The A2's horizontal resolution is doubled and the second line is illuminated or turned off depending on the scan line display setting. I've added two user buttons. One button toggles the different display modes and the second button toggles between the colour palettes, of which I only have two set up. I love how user friendly this solution is. Plug it in and it's up within two seconds; most of that time seems to be the monitor syncing up. I did consider doing an FPGA implementation instead. I purchased a Scarab FPGA with HDMI outputs a few years back but I never got around to doing anything with it. It wasn't exactly cheap and they don't look to be available anymore. There are cheaper FPGA alternatives around today, like the "Tang Nano 4K", which looks promising (I have one on order). So, is this going to be the best Apple II video solution? Definitely not. I still prefer to use a CRT when I can. When a CRT is not available, I love the VidHD, especially as a IIgs solution. All I wish is that we can make a device with great colour emulation for the HDMI platform that is cheap, easy to make and readily available. Making a VGA version may not be a bad idea either. As a DIY project I couldn't think of anything simpler. The bonus is that if you don't like any of my colour models then you'll have the option to re-program the device and display whatever suits you.
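The line doubling with the scan line option can be sketched like this (my own illustration of the idea, not the device's actual code):

```python
def double_lines(frame, scanline_brightness=0.5):
    """Double each of the 192 Apple II lines to fill 384 output lines.
    The repeated line is dimmed by scanline_brightness to mimic the
    dark gaps of a CRT raster (1.0 = plain doubling, 0.0 = fully off)."""
    out = []
    for row in frame:
        out.append(row)  # original line at full brightness
        out.append([int(p * scanline_brightness) for p in row])  # dimmed copy
    return out

# A full-white 192-line frame doubled to 384 lines with 50% scan lines.
frame = [[255] * 560 for _ in range(192)]
doubled = double_lines(frame)
print(len(doubled))  # 384
```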

Video of the device going through its current display modes. They come in pairs: every second one is the same as the one before, except it is set to 50% scan lines. https://docs.google.com/open?id=18Pt6o6ZSSOQmiXkR6-xle7y614uFe4Pn

Saturday, October 2, 2021

Game Controller - AppleWin - Added support for 4play and SNES MAX cards




AppleWin, the dominant Apple II emulation software for the Windows platform, now supports the 4play and SNES MAX joystick cards as of version 1.30.5.0. Thank you to Tom and Nick (long-time AppleWin developers) for helping me get this feature into the software.

Gamers now have the opportunity to play games such as "Attack of the PETSCII Robots", which only supports the SNES MAX or a keyboard for user input, on an emulation platform with alternate controllers. A great option for those who do not have access to the physical joystick cards.

This also gives developers an easy way to help out with developing software for these cards. I use AppleWin quite a lot for debugging and this is going to save me a great deal of time. It will also help me with not having to lug around an Apple II development system on my family vacations.



In the AppleWin Configuration page, the "Input" tab is used to set the availability of the required card in slot 3, 4 or 5. A "PC Controller" needs to be connected to the PC. AppleWin will automatically detect the "PC Controller" (this is independent of what is selected for Joystick1 and Joystick2 on the configuration page). Some game controllers have different mappings for their buttons, so the most generic mapping has been set up as the default, but alternate mappings can be accessed via command line switches.

The latest version of AppleWin can be obtained here: https://github.com/AppleWin/AppleWin/releases