There has been a surge of enthusiasm in recent months, with several people expressing interest in working for Cosmic Horizon. If I were to rank the team members by effort to date, no one would argue that I'm still in first place (all of the commits so far have come from me), but second place clearly belongs to Dr. Bimal Mishra, an applied mathematician with a broad interest in science and technology. I first introduced you to him in 2007 as J. Peterman. Dr. Mishra didn't do much with Cosmic Horizon for a while, but he has since significantly intensified his study of the system. His initial contribution may be in the area of Apache Derby. Cosmic Horizon applications currently access Derby databases using the Embedded Derby JDBC driver. Sooner or later, we are likely to move to the more familiar client/server mode, and Dr. Mishra may take us there sooner. Wherever he chooses to contribute, his efforts are greatly appreciated.
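For readers unfamiliar with the distinction, the difference between the two Derby modes shows up right in the JDBC connection URL. Here is a small sketch; the database name "cosmic", host, and port 1527 (Derby's default Network Server port) are illustrative, not taken from the actual Cosmic Horizon configuration:

```java
// Sketch contrasting Apache Derby's two access modes via their JDBC URLs.
// Embedded mode runs the database engine inside the application's JVM;
// client/server mode connects to a separately running Derby Network Server.
public class DerbyUrls {
    static String embeddedUrl(String dbName) {
        // Embedded Derby: only one JVM may open the database at a time.
        return "jdbc:derby:" + dbName;
    }

    static String clientUrl(String host, int port, String dbName) {
        // Client/server mode: many clients can share one database.
        return "jdbc:derby://" + host + ":" + port + "/" + dbName;
    }

    public static void main(String[] args) {
        System.out.println(embeddedUrl("cosmic"));
        System.out.println(clientUrl("localhost", 1527, "cosmic"));
    }
}
```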

Now, let me walk you through the progress on our primary work since my last blog item (Glorified Clerical Workers?).

On 2008-12-02, I had RTPG understanding a simple test case template, which would select either a test case exercising the SPARC-V9 ADD instruction or one exercising MULX.

Five months earlier, I had last worked with a program called RandomOperands. At that time, I was satisfied to have that program produce just one pair of random multiplication operands, where each operand was positive and the pair would not overflow the 64-bit result that MULX produces. The algorithm was destined for RTPG, so the next thing I did was to refactor RandomOperands to employ a reusable class.
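The constraint described above (both operands positive, product small enough not to overflow MULX's 64-bit result) can be sketched as follows. To be clear, this is not the original RandomOperands code; the class and method names are illustrative:

```java
import java.util.Random;

// Hedged sketch: produce one pair of positive 64-bit operands whose product
// fits in a signed 64-bit result. After choosing the first operand, the
// second is bounded by Long.MAX_VALUE / a, the largest safe multiplier.
public class OperandPair {
    private final Random rng;

    public OperandPair(long seed) {
        rng = new Random(seed);
    }

    // Returns {a, b} with a, b >= 1 and a * b <= Long.MAX_VALUE.
    public long[] next() {
        long a = Math.max(1, rng.nextLong() >>> 1);   // uniform in [1, 2^63 - 1]
        long limit = Long.MAX_VALUE / a;              // largest b with a * b <= MAX
        long b = 1 + (rng.nextLong() >>> 1) % limit;  // in [1, limit]; modulo bias is acceptable for a sketch
        return new long[] { a, b };
    }
}
```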

The next steps would be: (1) adjust the generation algorithm to get a reasonable runtime, and (2) continue work on RTPG, including removal of expected values (a reference model responsibility) from the test cases database.

In my article, Development of a Microprocessor Verification Environment in Java, the Constrained Random Generation section covers, at the appropriate level of detail, what I would otherwise write next here, so I refer you to that. Five months earlier, I had used a linear curve to produce the single pair of operands that I needed. Now I was working with the sigmoid function and beginning to adjust its shape. To generate thousands of test programs in a reasonable amount of time, the algorithm had to be sped up quite a bit.
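For readers who haven't seen the article: the sigmoid is the standard logistic function, and "adjusting its shape" means tuning its parameters. A minimal sketch, assuming the standard logistic form with a steepness knob and a midpoint knob (the article covers how the curve is actually applied to operand generation):

```java
// Hedged sketch of the logistic sigmoid and its two shape parameters.
public class SigmoidShaper {
    private final double k;   // steepness: larger k -> sharper transition
    private final double x0;  // midpoint: where the curve crosses 0.5

    public SigmoidShaper(double k, double x0) {
        this.k = k;
        this.x0 = x0;
    }

    // Classic logistic sigmoid: smoothly maps (-inf, +inf) onto (0, 1).
    public double value(double x) {
        return 1.0 / (1.0 + Math.exp(-k * (x - x0)));
    }
}
```

Tuning k trades off how heavily the generated values cluster around the midpoint versus spreading toward the extremes, which is the kind of measurement-and-adjustment loop described below.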

Nothing appears in my notes between 2009-01-05 and 2009-02-25, because that's when I was writing the above article.

Once the article was published, I spent a few days taking measurements and adjusting the shape of the sigmoid function until I was happy.

Then it was time to take the aforementioned reusable class, and reuse it in RTPG.

On 2009-03-01, I wrote a list of tasks that I wanted to complete before FSS Version_0-007 release:

1. Upgrade Apache Derby on jesus.
2. Remove expectedSum column.
3. Identify pairID for each existing pair that satisfies multiplication constraints.
4. Create MULXTestCases table.
5. Add row to MULXTestCases table for each pairID identified above.
6. Add rows to MULXTestCases table to total 2840 rows.
7. Add rows to ADDTestCases table for each new pairID in operands table.
8. Run automated verification on ADDTestCases.
9. Run automated verification on MULXTestCases, adding test cases as necessary until workload is 8 hours.

That list will make more sense if you refer to Specification for Providing a Test Cases Database.

Particularly relevant as I do this part of the work are the following sections from Comprehensive Functional Verification:

Creating Environments
Test Bench Writing Tools
C/C++ Libraries
Test Bench Building Block Objects: Params
High-Level Verification Languages
A Flavor of OpenVera
A Flavor of e
A Flavor of SystemC
Strategies for Simulation-Based Stimulus Generation
Strategies for Stimulus Generation
Applying the Four Types of Stimulus Generation to Calc2
Pregenerated Random Test Cases
Constraint Solving in Random Environments
Making Rare Events Occur


I have reached Step 7 of my FSS Version_0-007 release list: on 2009-04-07, I was able to dump all of the new operand pairs. Those pairs will work well with the evolving test case template language:

<template_line> ::= <mnemonic> <operands> <EOL>
<mnemonic> ::= "ADD" | "MULX"
<operands> ::= <operand> <operand>
<operand> ::= "rand" | <number>

To complete Step 7, I intend to create a Perl program that provides one test case template for each operand pair that I just dumped. For each template, the Perl program will call RTPG and tell it to generate one test case. Each template specifies both operands explicitly (i.e. it is directed), so "rand" is not used.
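The grammar above is small enough to parse with a few lines of code. Here is a hedged Java sketch; RTPG's actual parser is not shown in this post, and the class name and representation below are my own invention for illustration:

```java
import java.util.OptionalLong;

// Sketch of a parser for the template grammar:
//   <template_line> ::= <mnemonic> <operand> <operand>
//   <mnemonic>      ::= "ADD" | "MULX"
//   <operand>       ::= "rand" | <number>
public class TemplateLine {
    public final String mnemonic;        // "ADD" or "MULX"
    public final OptionalLong op1, op2;  // empty means "rand" (generator chooses)

    private TemplateLine(String m, OptionalLong a, OptionalLong b) {
        mnemonic = m;
        op1 = a;
        op2 = b;
    }

    public static TemplateLine parse(String line) {
        String[] tok = line.trim().split("\\s+");
        if (tok.length != 3)
            throw new IllegalArgumentException("expected: <mnemonic> <operand> <operand>");
        if (!tok[0].equals("ADD") && !tok[0].equals("MULX"))
            throw new IllegalArgumentException("unknown mnemonic: " + tok[0]);
        return new TemplateLine(tok[0], operand(tok[1]), operand(tok[2]));
    }

    private static OptionalLong operand(String t) {
        return t.equals("rand") ? OptionalLong.empty() : OptionalLong.of(Long.parseLong(t));
    }
}
```

A directed template such as "MULX 6 7" yields two present operands; "ADD rand rand" leaves both empty for the generator to fill in.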

Step 9 will be the most difficult remaining step before the FSS Version_0-007 release. One thing that needs to be done is to begin development of the SPARC-V9 Standard Reference Model. I forced this upon myself by removing the expected results columns from the test cases database (Step 8, a regression step, is therefore currently broken too). For Version_0-007, I will keep the reference model as simple as possible, but it will finally exist. Also, the checking component needs to be made more intelligent: it must be taught to tolerate unknown latency. While integer multiplication latency will generally differ from one SPARC-V9 implementation to another, the more immediate concern is with Sputnik, which employs a multiplication algorithm whose latency depends on the operands.
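For the two instructions verified so far, "as simple as possible" really is simple. A hedged sketch of what such a minimal reference model might look like (the real Standard Reference Model doesn't exist yet, and this class name is mine): Java's wrapping 64-bit long arithmetic happens to match the architectural behavior, since ADD is a modulo-2^64 sum and MULX keeps the low 64 bits of the product.

```java
// Minimal sketch of a SPARC-V9 reference model covering only ADD and MULX.
public class ReferenceModel {
    public long execute(String mnemonic, long rs1, long rs2) {
        switch (mnemonic) {
            case "ADD":
                return rs1 + rs2;   // wraps modulo 2^64, like the hardware
            case "MULX":
                return rs1 * rs2;   // low 64 bits of the full product
            default:
                throw new IllegalArgumentException("unmodeled instruction: " + mnemonic);
        }
    }
}
```

The latency-tolerant checker would then compare the device-under-test result against this model's result whenever the result appears, rather than at a fixed cycle.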