
SpAIm


A cutting-edge MIPS assembly simulator built on progressive πŸš€, forward-thinking πŸ”₯ technology.

Usage

Run the SpAIm executable and provide an assembled MIPS assembly program file:

java -jar ./spaim.jar program.asm.out

SpAIm will read the file and execute its instructions.

Tip

You can generate the assembled output of a MIPS assembly program using a tool like Spim (spim -assemble).
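
For example, a rough end-to-end workflow might look like this (this assumes your spim build accepts -assemble and writes the assembled program to standard output; the exact invocation varies between spim versions):

spim -assemble program.asm > program.asm.out
java -jar ./spaim.jar program.asm.out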

Configuration

SpAIm can be configured using a config file named spaim.config.json in the working directory.

The following keys can be used in the config file:

  • syscalls: an array of custom syscall commands
  • ollama: settings for the AI integration (see Configuring the AI Integration below)

Each custom syscall command should be an object with these keys:

  • code: the integer value in $v0 corresponding to this syscall
  • run: the path to the executable to run
  • args: an array of arguments to be passed

Here's an example of a complete config file:

{
    "syscalls": [
        {
            "code": 100,
            "run": "shutdown",
            "args": [
                "now"
            ]
        },
        {
            "code": 25565,
            "run": "prismlauncher",
            "args": [
                "-l",
                "Minecraft 1.8.9",
                "-s",
                "mc.hypixel.net"
            ]
        }
    ],
    "ollama": {
        "model": "deepseek-r1:671b"
    }
}

This config file defines two custom syscalls, 100 and 25565. Syscall 100 shuts down the system (on a Unix system), and syscall 25565 launches a Minecraft instance through Prism Launcher.

Warning

If you define a syscall command that uses the code of a built-in syscall, the built-in syscall will take precedence.
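
For example, with the config above, a program can trigger the Minecraft syscall by loading its code into $v0 before a syscall instruction (a minimal sketch, assuming the spaim.config.json from the example is in the working directory):

.text
main:
    li $v0, 25565 # custom syscall code from spaim.config.json
    syscall       # runs: prismlauncher -l "Minecraft 1.8.9" -s mc.hypixel.net
    li $v0, 10    # exit
    syscall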

Limitations

There are some features that are not currently supported, including but not limited to:

  • Accessing memory that isn't word aligned
  • Assembly directives other than .data, .text, and .word
  • Floating-point arithmetic
  • Hi and Lo registers
  • mult and multu
  • Decent performance

However, due to SpAIm's AI integration πŸ”₯πŸ”₯πŸ”₯, SpAIm actually doesn't have any limitations πŸš€.

AI Integration πŸš€πŸš€πŸš€

SpAIm's AI πŸš€ integration πŸš€ requires an Ollama model to be running πŸ”₯. You also need to specify a model in the config πŸ”₯.

When you use syscall πŸš€ with $v0 == 11434 πŸ”₯ and $a0 πŸ”₯ containing the address of a string in memory πŸ”₯, SpAIm activates its best-of-breed, bleeding-edge AI integration πŸš€πŸš€πŸš€πŸš€ to evaluate your prompt πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯.

For example πŸ”₯:

.data
    buffer: .space 200 # room for the prompt string
.text
main:
    la $a0, buffer # $a0 = address of the prompt buffer
    li $v0, 8      # read_string syscall
    syscall        # read the prompt from stdin into buffer
    li $v0, 11434  # πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯
    syscall        # πŸš€πŸš€πŸš€πŸš€πŸš€πŸš€πŸš€πŸš€
    li $v0, 1      # print_int syscall: print the value in $a0
    syscall
    li $v0, 10     # exit
    syscall

Run the program above πŸ”₯πŸ”₯ and input this prompt πŸš€πŸš€:

Evaluate 2+3 and put the answer in $a0.

This will always πŸ”₯πŸ”₯πŸ”₯ print out the number 5 πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯ 100% πŸš€ of πŸš€ the πŸš€ time some of the time πŸ”₯.

The AI integration πŸš€πŸš€πŸš€πŸš€πŸš€ can also, in theory πŸ”₯, and theoretically πŸ”₯πŸ”₯ in practice πŸ”₯πŸ”₯πŸ”₯, read the register values πŸš€πŸš€ when evaluating your prompt πŸš€πŸš€πŸš€πŸš€πŸš€:

.data
    buffer: .space 200 # πŸ”₯
.text
main:
    la $a0, buffer # πŸ”₯
    li $v0, 8      # πŸ”₯
    syscall        # πŸ”₯
    li $t0, 3      # πŸ”₯πŸ”₯πŸ”₯
    li $t1, 4      # πŸ”₯πŸ”₯πŸ”₯πŸ”₯
    li $t2, 5      # πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯
    li $v0, 11434  # πŸ”₯πŸ”₯πŸ”₯πŸš€πŸš€πŸš€πŸ”₯πŸ”₯πŸ”₯
    syscall        # πŸš€πŸš€πŸš€πŸš€πŸš€πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯
    move $a0, $t3  # πŸ”₯πŸ”₯πŸ”₯
    li $v0, 1      # πŸ”₯πŸ”₯
    syscall        # πŸ”₯
    li $v0, 10     # πŸ”₯
    syscall        # πŸ”₯

Run the program πŸ”₯ above πŸ”₯ with this prompt πŸš€:

Multiply the values of $t0, $t1, and $t2 together and put the result in $t3.

This has a pretty good chance πŸ”₯πŸ”₯πŸ”₯, by my standards πŸ”₯πŸ”₯, of printing the number 60 πŸš€πŸš€πŸš€πŸ”₯πŸ”₯πŸ”₯.

Configuring the AI Integration πŸš€πŸš€πŸš€

In the ollama object πŸ”₯πŸ”₯ in the config file πŸš€, you can specify these values πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯πŸ”₯:

  • endpoint πŸ”₯: the URL πŸš€πŸš€πŸš€ of the Ollama πŸ”₯ chat endpoint πŸ”₯πŸ”₯πŸ”₯ (default πŸš€: http://localhost:11434/api/chat)
  • model πŸ”₯ (required): the name of the model to use πŸ”₯πŸ”₯πŸ”₯ (example: deepseek-r1:671b πŸš€πŸš€πŸš€πŸš€πŸš€πŸš€)
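
Putting those together, the ollama section of a config might look like this (spelling out the default endpoint explicitly):

{
    "ollama": {
        "endpoint": "http://localhost:11434/api/chat",
        "model": "deepseek-r1:671b"
    }
}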

Note

SpAIm's AI integration πŸ”₯πŸ”₯πŸ”₯πŸ”₯ requires a powerful πŸ”₯ LLM πŸš€ to work properly. If the LLM πŸš€ you are using runs on your machine πŸ”₯πŸ”₯, it is too small πŸ”₯πŸ”₯πŸ”₯πŸ”₯.

Building

To build the project from source, run one of the following commands.

Mac/Linux:

./gradlew assemble

Windows:

.\gradlew.bat assemble

The executable JAR should be written to build/libs.
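
Once built, you can run the JAR from there in the same way (the exact file name under build/libs may include a version suffix):

java -jar build/libs/spaim.jar program.asm.out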

Credits

Special thanks to NVIDIA in advance for sponsoring this project πŸ‘.

Special unthanks to MIPS Tech LLC for not sponsoring this project πŸ‘Ž.

License

SpAIm is MIT licensed.