PC-98 GDC and LIO drawing support #3508
Conversation
… scrolling lines (port 0x76).
Fixed int 18h ah=42h and ah=4dh to match actual PC-98x1 behavior.
@nanshiki
Just my two cents... I suppose masking the CRTC rows by 0x0F is correct for the 200-line modes, but according to a real PC-9821 laptop I test with, it seems to respond to the low 5 bits, not 4 bits. In fact, the DOS console modes where there are 20 scanlines per row instead of 16, usually to fit 25 rows in 480 scanlines, would be impossible if that were not a 5-bit-wide register. Note that the CG still only supports 16 rows per char bitmap, so this is done by programming the hardware to start 2 scanlines down, then for each row, emit the 16 display scanlines followed by 4 blank lines, such that the 16-scanline glyph is centered within each 20-scanline row. What does masking by 0x0F fix? Perhaps legacy PC-98 behavior is somewhat like MCGA, where 200-line modes use the low 4 bits of row height and VGA-compatible 400-line modes use the low 5 bits.
I think I found a case where your GDC draw support needs a little work. Windows 3.1 uses the GDC draw commands too. The latest commit seems to have also broken some Windows 3.1 drawing functions. Notice the black border around the tiles in Paintbrush. (attached: command_000.avi.mp4)
I'm going to modify the GDC draw code where it concerns memory access. mem_readb() and mem_writeb() are CPU-side memory I/O, don't use them from hardware access, please. Your VGA card cannot cause the CPU to read/write memory and neither should the GDC. I just went over this with @Wengier, and for the same reason, the graphical issues and possible instabilities are resolved by commenting out any part of GDC draw that calls mem_readb()/mem_writeb(). This isn't an issue for PC-98 games because they either run in real mode or they run in virtual 8086 mode where the video memory isn't virtualized in any way. It's in protected-mode environments like Windows 3.1 through ME that issuing mem_readb()/mem_writeb() like that can cause problems.
Final question: I get the impression that the GDC functions entirely on its own from the EGC hardware. That would mean that perhaps when the GDC draws on planar memory it does so directly to the planar memory, not by reading/writing through the EGC. Right?
I checked on an actual PC-9821: in 640x400 mode, the scroll position returns to 0 when port 76h = 10h, and from 11h onward it seems to shift one line at a time. However, after 10h the character display of the first line becomes strange. The software is "joshikousei-no-nazo-time-slip-toujankyou-maboroshi-ware-neo-kobe-pc98-ia" from the DOSBox game compat testing chart. I did not understand the point about mem_readb()/mem_writeb(). Please correct the code.
@nanshiki I will test that game. I am adding a function to vga_memory.cpp to give your GDC draw code a more direct interface to the read/write functions of PC-98 VRAM than through mem_readb/mem_writeb. I will work on the code assuming that GDC read/modify/write cycles (meaning the GDC issues read/write I/O) go through EGC the same as CPU I/O. I'll restore your masking of that register by 0x0F in that case. When I first wrote that code I assumed that since other registers were 5 bits, maybe the register in question was 5 bits as well. On another note, the LIO BIOS needs to separately set GDC mode AND modify that byte in the BIOS data area. The GDC is a hardware device that isn't supposed to issue CPU memory I/O. I'll separate that GDC state from the LIO modification of the state in the BIOS data area. I'm going to start adding #defines to the VGA draw code and other code to make sure the CPU-side memory read/write functions cannot be used in code that's supposed to reflect hardware outside the CPU, the same technique used by FFmpeg to prevent itself from using fprintf/printf.
As for port 76h, I think 5 bits are valid, because the scroll position changes up to 13h in 20-line mode (80x20 character mode).
This still works slightly differently from the actual machine when the number of scanlines is exceeded. Please correct the GDC and LIO. |
So perhaps it should be treated as 5 bits that are sign-extended?
@nanshiki Can you look at Balance of Power? I know it uses LIO BIOS functions but it still doesn't look right. It doesn't seem to clear bitplanes correctly. |
@nanshiki I have corrected the GDC draw to use a more direct interface to the PC-98 video RAM, without going through the CPU, and I have corrected the LIO BIOS to take responsibility for the draw state in the BIOS data area instead of the GDC. As far as I can tell, it hasn't changed how your code works with games. The only remaining mystery is what Windows 3.1 is trying to do when drawing the XOR line while you click and drag in Paintbrush. |
@nanshiki Speaking of using EGC to enhance the GDC line draw, I just noticed that I can make 99.99999999% of the line drawing artifacts disappear in Windows 3.1 if I modify the REPLACE/SET case of draw_dot() to only write to, but not read from, the video memory. Which means Windows 3.1 sets the GDC draw pattern to 0xFF and the EGC to a XOR raster op. I'm also noticing that the GDC_CMD_MODE_*: cases are documented to then accept data bytes that, in graphics modes, set the pattern to either 0xFFFF or 0x0000 based on the LSB. The pattern is 0xFFFF, therefore the line draw is effectively XORing the entire 8 bits, and that's why there are black blocks around the white line. I think I can fix that without breaking any games.
Please try the latest code. I don't think I've broken anything your code fixes. I've added a hack to the GDC drawing code that does dummy reads when EGC is in use which is necessary to completely eliminate line drawing artifacts in Windows 3.1. As for the LIO BIOS functions, your code has enabled working display and graphics in many games of the msdos test set that were previously marked as not displaying anything. 👍 |
@nanshiki Can you make another pull request adding your changes to the CHANGELOG so you can be credited properly for your work in the next release? |
@joncampbell123 Thanks for the correction. Unfortunately, I do not have Balance of Power in my possession. I have created a test program for port 76h. |
@joncampbell123 My English is poor, so can you please correct the CHANGELOG there? |
@nanshiki What would you like me to add to the CHANGELOG? |
I will leave the contents to you. I have tried drawing with EGC via the GDC and there seems to be a problem. When executed, the program first draws with the GDC only, and when a key is pressed, it draws with the GDC using EGC.
@nanshiki According to this datasheet, the GDC is said to draw using read/modify/write memory cycles, and prior discussion says those apparently go through the EGC. http://hackipedia.org/browse.cgi/Computer/Platform/PC%2c%20NEC%20PC%2d98/Video/NEC%20High%2dPerformance%20Graphics%20Display%20Controller%207220/%c2%b5PD7220%2c%20%c2%b5PD7220%2d1%2c%20%c2%b5PD7220%2d2%20Graphics%20Display%20Controller%2epdf However, R/M/W through the EGC seemed problematic with Windows 3.1 until I added a hack to issue a dummy read (result ignored) and then write the pattern. It only worked if the R/M/W acted as if the read got back 0x00.
@nanshiki I see nothing in LINE.C that changes anything about EGC. Where does it write to EGC registers? As far as I can tell, it uses non-EGC drawing twice. And yet the compiled executable is switching on EGC drawing for the second draw. LINE.C doesn't seem to quite represent what LINE.EXE actually does. |
@joncampbell123 EGC drawing in LINE.C uses the GRCG tile register, which is set via outp(0x7c,0xc0) in linee() to simultaneously write multiple planes to a color line with a single GDC call. This source is taken from the EGC speedup sample source in the GDC Technical Book (ISBN: 4890520902). EGC is upward compatible with GRCG. GRCG could only be used for writing from the CPU, but EGC supports writing from the GDC. See the following (in Japanese) |
@joncampbell123 Regarding the PC-9801 version of Balance of Power: the screen images I found in my search seem to be in monochrome.
@nanshiki Reading through Google Translate, I think that website you linked to is saying that the EGC has a GRCG-compatible mode and an EGC enhanced mode, and the GRCG mode is available to both the CPU and the GDC when either one does memory I/O. It does not say that the GDC has access to enhanced EGC functions. However, the behavior of Windows 3.1 suggests that maybe some enhanced EGC functions are possible with GDC line drawing. I checked using debug logging of the GDC drawing state, and Windows 3.1 uses the REPLACE mode of WDAT commands and the EGC raster op for XOR, not the COMPLEMENT mode as you'd expect. GDC memory I/O is said to be read/modify/write, but if I do that with Windows 3.1, artifacts occur because the read picks up the white background and XORs it with the white background, producing the black vertical strips as seen in the video I posted some time back. This complicated behavior will have to be determined precisely by writing more test code in DOSLIB and running it on real PC-9821 hardware, which is also an opportunity to examine your other test program regarding the text scroll region, TEST.EXE. I suppose DOSBox-X could make more progress at this point adding the monochrome mode you mentioned. What does the monochrome mode do? Render the first bitplane's bits as if they were all 3 or 4 planar bits? Clearly Balance of Power is using other bitplanes, so I wonder if that's the case after mapping through the 8-color digital palette.
Basic test with DEBUG.EXE on a real laptop: The 8-color "digital" mode is active. Switching on monochrome mode makes bitplane 2 the only displayed bitplane, as if bitplane 2 becomes all three planar bits. I used DEBUG.EXE to poke some bytes into planar graphics. It helps to remember that the bitplanes from B800 down to A800 are in G R B order. In 16-color modes memory appears at E000 and is the "E" plane to provide the 4th bit. (attached: VID_20220525_203224745.mp4.mp4)
Which bitplane becomes the one monochrome bitplane is affected by the 8-color digital palette registers, confirmed. Which is probably how Balance of Power expects to manage the display and possibly flip between bitplanes.
If you switch on monochrome mode and 16-color analog mode, there are some strange results. It seems to show white only if BITPLANE2 == 1 && BITPLANE3 == 0. Furthermore, the normal 16-color analog palette has no effect; it renders as if the 8-color digital palette were still in effect. However, perhaps the monochrome mode was never intended for anything but the 8-color digital mode.
The aforementioned book also states, "EGC has a compatibility mode with GRCG, and even allows it to be used from GDC." and it does not say that EGC's ROPs, etc. can be used. As for the monochrome mode, we will try various things here as well. |
@nanshiki I think monochrome mode is simple enough. It's for 8-color digital graphics mode. Bitplane 2 becomes the only displayed bitplane, after palette remapping through A8h-AEh even. It acts strangely in 16-color analog graphics mode, but probably because you're not supposed to do that :) |
If there is another register that controls which bitplane is THE bitplane in monochrome mode, that would be good to know. |
Regarding monochrome mode: it is not an operation that can be handled by the palette. It is strange that colors appear in something called monochrome mode, but this seems to match the behavior of the 640x400 mode of the PC-8801 (the 8-bit machine before the PC-9801). Like the simple graphics mode, this feature exists for compatibility with past models, and its use was not recommended.
What I mean is that, at least on the PC-9821 laptop I have (a 486), monochrome mode seems to function like this: it gets a 3-bit color from the 3 bitplanes, remaps it through the 8-color digital palette, and then, after mapping, bit 2 determines whether the pixel is white or black. So it is likely possible to change which bitplane acts as the monochrome plane, or even to map specific palette indexes to black or white this way. I will test your program.
Balance of Power does not appear to use monochrome mode. The log would say so if it saw unknown port 68h write 02h/03h. However the log does say "LIO GCIRCLE not support flags: 05". |
@nanshiki Just to confirm, are you suggesting that the color of the monochrome graphics display plane is determined by the color attribute of the text layer at that point on screen? So that would mean that despite two separate genlocked GDCs in the system, the text and graphics output work together in monochrome mode to determine if the pixel is set or not and what color? When I was testing on the laptop the entire text layer was white on black, which would be the reason the graphics were the same, correct? By the way, bitplane 2 at B800 represents green, and I am aware that traditionally in RGB hookups that carry monochrome, the green pin is usually used to carry that signal, which is probably why that plane was chosen. I think MDA does the same thing while using the same 9-pin connectors as CGA in the IBM PC world. I am away from the test unit at this time and I cannot confirm on real hardware until later tonight. |
@joncampbell123 MONO.EXE seems to display the same as on the actual device. It is wonderful.
If the display still does not work with Balance of power, you may need to add a process to change the palette according to the display screen selection in the parameter ds:[bx+3] of GSCREEN in LIO. |
According to the GSCREEN code it is indicating a preference for mono, but then it calls INT 18h AH=42h with CH=C0h to effectively cancel it out. I am using the copy available on The Internet Archive "Neo Kobe" collection: |
@joncampbell123 Neko Project also has the same symptom without BIOS.ROM, so I still think it is a problem with the monochrome mode setting in GSCREEN in LIO. |
GDC and LIO drawing are implemented based on Neko Project II.
If the GDC clock is 5 MHz, DGD must be set to 1 in VECTW. However, this drawing process does not refer to the PITCH value, so as long as the screen settings pass normally, drawing works even when DGD is 0.
GDC command interpretation time and drawing time are not taken into consideration.
LIO adds GCIRCLE ellipse and fill and GPAINT1/GPAINT2 functions.
#930 I could not obtain KATALITH, but MAZE_999 and HELLO2 now display.
#2421 FILMTN now displays correctly.
Other bug fixes.
int 18h, ah=42h does not change the read/write page of VRAM.
#692 Corrected the valid range of the setting value for the number of text scrolling lines (port 0x76). Some applications expect the set value 0x10 to display the same as 0x00.
Bit7 of memory 0000:054D is changed by int 18h ah=4dh. In PC-9821, Bit7=0 by default. int 18h ah=4dh ch=01h will make Bit7=1. This is confirmed to be the case on an actual PC-9821.
Fixed incorrect reading of odd addresses when EGC port 04A4h Bit13=0.