add prompting example to examples/ #13

Merged Sep 20, 2023 (1 commit)
349 changes: 349 additions & 0 deletions examples/prompting.ipynb
@@ -0,0 +1,349 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"\"\"\"\n",
"This notebook is designed to showcase how to build your own subnet, modeled after the \n",
"Bittensor Text Prompting subnet. This subnet is designed to take in a list of messages\n",
"and a list of roles, and generate a completion based on the current state of the network.\n",
"\n",
"To create a subnet, you must define three (3) main elements: \n",
"- the protocol message format (synapse subclass)\n",
"- the miner (how to complete requests)\n",
"- the validator (how to validate requests)\n",
"\n",
"Below is a simple implementation of the protocol, so you can see how it works.\n",
"\n",
"#NOTE: You MUST implement \"{field}_hash\" for any required field in your synapse.\n",
"\n",
"For example, if you have a `messages` field, there must also be a complementary\n",
"`messages_hash` field. This is used to ensure that the data is not tampered with\n",
"during transit. You can use the `required_hash_fields` property to specify which\n",
"fields are required for hash computation.\n",
"\n",
"You can also use the `deserialize` method to specify how to deserialize the data\n",
"into a tensor. This is useful for converting strings into tensors, for example.\n",
"\n",
"You must also define your priority and blacklist functions. See below:\n",
"\"\"\"\n",
"\n",
"\n",
"import bittensor as bt\n",
"import pydantic\n",
"import time\n",
"import torch\n",
"bt.trace()\n",
"\n",
"from typing import List, Dict, Optional, Tuple, Union, Callable\n",
"# Subnet creator controls\n",
"class Prompting( bt.Synapse ):\n",
" \"\"\"\n",
" Represents the core component of a Synapse for handling and controlling message prompting in a network.\n",
" \n",
" The Prompting synapse captures roles and messages, and can generate a completion based on the current state. \n",
" It also contains unique hashes for roles and messages to ensure integrity and uniqueness.\n",
" \n",
" Attributes:\n",
" roles (List[str]): A list of roles associated with this synapse. Immutable post-instantiation.\n",
" messages (List[str]): A list of messages associated with this synapse. Immutable post-instantiation.\n",
" completion (str): A field to store the completion or result after processing.\n",
" messages_hash (str): An immutable hash representing the messages in the prompting scenario.\n",
" roles_hash (str): An immutable hash representing the roles in the prompting scenario.\n",
" \"\"\"\n",
" class Config: validate_assignment = True\n",
" def deserialize( self ): return self.completion\n",
"\n",
" roles: List[str] = pydantic.Field(..., allow_mutation=False)\n",
" messages: List[str] = pydantic.Field(..., allow_mutation=False)\n",
" completion: str = None\n",
"\n",
" messages_hash: str = pydantic.Field(\n",
" \"\",\n",
" title=\"Messages Hash\",\n",
" description=\"Hash of the messages in the Prompting scenario. Immutable.\",\n",
" allow_mutation=False,\n",
" )\n",
"\n",
" roles_hash: str = pydantic.Field(\n",
" \"\",\n",
" title=\"Roles Hash\",\n",
" description=\"Hash of the roles in the Prompting scenario. Immutable.\",\n",
" allow_mutation=False,\n",
" )\n",
"\n",
" @property\n",
" def required_hash_fields(self) -> List[str]:\n",
" \"\"\" Returns the list of fields that are essential for hash computation. \"\"\"\n",
" return ['messages']\n",
"\n",
"def prompt( synapse: Prompting ) -> Prompting:\n",
" \"\"\"\n",
" Process the provided synapse to generate a completion.\n",
"\n",
" Args:\n",
" synapse (Prompting): The input synapse to be processed.\n",
"\n",
" Returns:\n",
" Prompting: The updated synapse with a completion.\n",
" \"\"\"\n",
" bt.logging.debug(\"In prompt!\")\n",
" synapse.completion = \"I am a chatbot\"\n",
" return synapse\n",
"\n",
"def blacklist( synapse: Prompting ) -> Tuple[bool, str]:\n",
" \"\"\"\n",
" Determines if the provided synapse should be blacklisted.\n",
"\n",
" Args:\n",
" synapse (Prompting): The input synapse to be evaluated.\n",
"\n",
" Returns:\n",
" Tuple[bool, str]: A tuple containing a boolean that indicates whether the synapse is blacklisted,\n",
" and a string providing the reason.\n",
" \"\"\"\n",
" return False, \"\"\n",
"\n",
"def priority( synapse: Prompting ) -> float:\n",
" \"\"\"\n",
" Determines the priority of the provided synapse.\n",
"\n",
" Args:\n",
" synapse (Prompting): The input synapse to be evaluated.\n",
"\n",
" Returns:\n",
" float: The priority value of the synapse, with higher values indicating higher priority.\n",
" \"\"\"\n",
" return 0.0\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO: Started server process [3088652]\n",
"INFO: Waiting for application startup.\n",
"TRACE: ASGI [1] Started scope={'type': 'lifespan', 'asgi': {'version': '3.0', 'spec_version': '2.0'}, 'state': {}}\n",
"TRACE: ASGI [1] Receive {'type': 'lifespan.startup'}\n",
"TRACE: ASGI [1] Send {'type': 'lifespan.startup.complete'}\n",
"INFO: Application startup complete.\n",
"INFO: Uvicorn running on http://0.0.0.0:8099 (Press CTRL+C to quit)\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.962\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Pre-process synapse for request\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - HTTP connection made\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.962\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | dendrite | --> | 3837 B | Prompting | 5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH | 216.153.62.113:8099 | 0 | Success\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Started scope={'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('127.0.0.1', 8099), 'client': ('127.0.0.1', 43048), 'scheme': 'http', 'method': 'POST', 'root_path': '', 'path': '/Prompting', 'raw_path': b'/Prompting', 'query_string': b'', 'headers': '<...>', 'state': {}}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.978\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | request.headers Headers({'host': '0.0.0.0:8099', 'name': 'Prompting', 'timeout': '12.0', 'bt_header_axon_ip': '216.153.62.113', 'bt_header_axon_port': '8099', 'bt_header_axon_hotkey': '5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH', 'bt_header_dendrite_ip': '216.153.62.113', 'bt_header_dendrite_version': '600', 'bt_header_dendrite_nonce': '2965315063849355', 'bt_header_dendrite_uuid': 'eea38ac4-5691-11ee-8416-56c4f4c1dac3', 'bt_header_dendrite_hotkey': '5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH', 'bt_header_dendrite_signature': '0x18e44fcd52477c8c5bdb86a8255198dd83d78ceb97160194e7116945696b8f29e05bb129aef65ac0c0a9c86d36710b9fb18570f38ec2a097e515db70f96f3b8d', 'bt_header_input_obj_roles': 'W10=', 'bt_header_input_hash_roles': '70462836afeca889fed38157ba64b3ae1298b87f08270130a9c33704fc901c58', 'bt_header_input_obj_messages': 'W10=', 'bt_header_input_hash_messages': '13559dc1af7acbfb7f134eca00c24bbb0893b83c1199badde4da7e4b7421cb79', 'header_size': '640', 'total_size': '3865', 'accept': '*/*', 'accept-encoding': 'gzip, deflate', 'user-agent': 'Python/3.10 aiohttp/3.8.5', 'content-length': '852', 'content-type': 'application/json'})\n",
"\u001b[34m2023-09-19 02:12:06.978\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | axon | <-- | 852 B | Prompting | 5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH | 127.0.0.1:43048 | 200 | Success \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Receive {'type': 'http.request', 'body': '<852 bytes>', 'more_body': False}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.979\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Check Blacklist \n",
"\u001b[34m2023-09-19 02:12:06.979\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Run priority \n",
"\u001b[34m2023-09-19 02:12:06.981\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Check verification \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Send {'type': 'http.response.start', 'status': 200, 'headers': '<...>'}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"INFO: 127.0.0.1:43048 - \"POST /Prompting HTTP/1.1\" 200 OK\n",
"\u001b[34m2023-09-19 02:12:06.981\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | loading body to json \n",
"\u001b[34m2023-09-19 02:12:06.982\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | Checking messages \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Send {'type': 'http.response.body', 'body': '<805 bytes>', 'more_body': True}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.982\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Run forward \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Send {'type': 'http.response.body', 'body': '<0 bytes>', 'more_body': False}\n"
]
},
{
"data": {
"text/plain": [
"[Prompting(roles=['user'], messages=['hello, who are you?'], completion='I am a chatbot', messages_hash='', roles_hash='')]"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.984\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | In prompt! \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Receive {'type': 'http.disconnect'}\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.987\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Fill successful response \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - ASGI [2] Completed\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[34m2023-09-19 02:12:06.992\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Finally \n",
"\u001b[34m2023-09-19 02:12:06.993\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | axon | --> | 805 B | Prompting | 5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH | 127.0.0.1:43048 | 200 | Success\n",
"\u001b[34m2023-09-19 02:12:06.995\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Non-streaming response detected.\n",
"\u001b[34m2023-09-19 02:12:06.997\u001b[0m | \u001b[36m\u001b[1m TRACE \u001b[0m | Postprocess server response \n",
"\u001b[34m2023-09-19 02:12:06.998\u001b[0m | \u001b[34m\u001b[1m DEBUG \u001b[0m | dendrite | <-- | 4391 B | Prompting | 5C86aJ2uQawR6P6veaJQXNK9HaWh6NMbUhTiLs65kq4ZW3NH | 216.153.62.113:8099 | 200 | Success\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"TRACE: 127.0.0.1:43048 - HTTP connection lost\n"
]
}
],
"source": [
"# Create an Axon instance on port 8099.\n",
"axon = bt.axon(port=8099)\n",
"\n",
"# Attach the forward, blacklist, and priority functions to the Axon.\n",
"# forward_fn: The function to handle forwarding logic.\n",
"# blacklist_fn: The function to determine if a request should be blacklisted.\n",
"# priority_fn: The function to determine the priority of the request.\n",
"axon.attach(\n",
" forward_fn=prompt,\n",
" blacklist_fn=blacklist,\n",
" priority_fn=priority\n",
")\n",
"\n",
"# Start the Axon to begin listening for requests.\n",
"axon.start()\n",
"\n",
"# Create a Dendrite instance to handle client-side communication.\n",
"d = bt.dendrite()\n",
"\n",
"# Send a request to the Axon using the Dendrite, passing in a StreamPrompting instance with roles and messages.\n",
"# The response is awaited, as the Dendrite communicates asynchronously with the Axon.\n",
"resp = await d(\n",
" [axon],\n",
" Prompting(roles=[\"user\"], messages=[\"hello, who are you?\"]),\n",
" deserialize=False\n",
")\n",
"\n",
"# The response object contains the result of the prompting operation.\n",
"resp"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "rev2",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 2
}