Commit 6f5142b
Generated from contents of master commit: 5fd8ae3
1 parent: dbce1fd
Showing 352 changed files with 5,881 additions and 0 deletions.
@@ -0,0 +1 @@
{"pageProps":{"listData":[{"linkUrl":"/code/abstract-map-simulator","mediaPosition":"center","mediaUrls":["/_next/static/images/abstract_map_simulation-55e32b58dd5e4ed9caf7a85baf98677c.png.webp","/_next/static/images/abstract_map_simulation-3a9dbfc04fa16e80a961cec841d316fc.png"],"primaryText":"2D Simulator for Zoo Experiments","secondaryText":"btalb/abstract_map_simulator","secondaryTransform":"lowercase"},{"linkUrl":"/code/abstract-map","mediaPosition":"center","mediaUrls":["/_next/static/images/abstract_map_in_action-51c5e1dcb68134fbb20baad53816b40f.png.webp","/_next/static/images/abstract_map_in_action-863c3403cb5be611fa8f5dcbdbb45c3f.png"],"primaryText":"Abstract Map (Python)","secondaryText":"btalb/abstract_map","secondaryTransform":"lowercase"},{"linkUrl":"/code/abstract-map-app","mediaPosition":"center","mediaUrls":["/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webm","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.mp4","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webp","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.jpg"],"primaryText":"Android App for Human Participants","secondaryText":"btalb/abstract_map_app","secondaryTransform":"lowercase"},{"linkUrl":"/code/armer","mediaPosition":"center","mediaUrls":["/_next/static/images/armer_example-ff4e12b2ac663fa5fb394397d23d2681.webm","/_next/static/images/armer_example-ff4e12b2ac663fa5fb394397d23d2681.mp4","/_next/static/images/armer_example-ff4e12b2ac663fa5fb394397d23d2681.webp","/_next/static/images/armer_example-ff4e12b2ac663fa5fb394397d23d2681.jpg"],"primaryText":"Armer Driver","secondaryText":"qcr/armer","secondaryTransform":"lowercase"},{"linkUrl":"/code/ros-trees","mediaPosition":"center","mediaUrls":["/_next/static/images/frankie-d932493db407ac66026b9fb5968df6f2.webm","/_next/static/images/frankie-d932493db407ac66026b9fb5968df6f2.mp4","/_next/static/images/frankie-d932493db407ac66026b9fb5968df6f2.webp","/_next/static/images/frankie-d932493db407ac66026b9fb5968df6f2.jpg"],"primaryText":"Behaviour trees for ROS","secondaryText":"qcr/ros_trees","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot-addons","mediaPosition":"center","mediaUrls":["/_next/static/images/benchbot_addons-39bf0e168760909371d48341ec57fbad.webm","/_next/static/images/benchbot_addons-39bf0e168760909371d48341ec57fbad.mp4","/_next/static/images/benchbot_addons-39bf0e168760909371d48341ec57fbad.webp","/_next/static/images/benchbot_addons-39bf0e168760909371d48341ec57fbad.jpg"],"primaryText":"BenchBot Add-ons Manager","secondaryText":"qcr/benchbot_addons","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot-supervisor","mediaPosition":"center 0%","mediaUrls":["/_next/static/images/benchbot_supervisor-3e4092b6584962e3e4529101ae489a08.jpg.webp","/_next/static/images/benchbot_supervisor-fb509eb331f3380fbf5da2c3035116b6.jpg"],"primaryText":"BenchBot Backend Supervisor","secondaryText":"qcr/benchbot_supervisor","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot-eval","mediaPosition":"center","mediaUrls":["/qcr_logo_light_filled.svg"],"primaryText":"BenchBot Evaluation Tools","secondaryText":"qcr/benchbot_eval","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot-api","mediaPosition":"center 
100%","mediaUrls":["/_next/static/images/benchbot_api_web-9d335e4f90cddac6fb91d546b1d6dc20.webm","/_next/static/images/benchbot_api_web-9d335e4f90cddac6fb91d546b1d6dc20.mp4","/_next/static/images/benchbot_api_web-9d335e4f90cddac6fb91d546b1d6dc20.webp","/_next/static/images/benchbot_api_web-9d335e4f90cddac6fb91d546b1d6dc20.jpg"],"primaryText":"BenchBot Python API","secondaryText":"qcr/benchbot_api","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot-simulator","mediaPosition":"center","mediaUrls":["/_next/static/images/benchbot_simulator-0194a01dd7a2f34b0b6a9e53bd88dc2e.webm","/_next/static/images/benchbot_simulator-0194a01dd7a2f34b0b6a9e53bd88dc2e.mp4","/_next/static/images/benchbot_simulator-0194a01dd7a2f34b0b6a9e53bd88dc2e.webp","/_next/static/images/benchbot_simulator-0194a01dd7a2f34b0b6a9e53bd88dc2e.jpg"],"primaryText":"BenchBot Simulator (Isaac)","secondaryText":"qcr/benchbot_simulator","secondaryTransform":"lowercase"},{"linkUrl":"/code/benchbot","mediaPosition":"100% center","mediaUrls":["/_next/static/images/benchbot_web-0b702a0b6abfa7d90459998e6fa0ee8c.webm","/_next/static/images/benchbot_web-0b702a0b6abfa7d90459998e6fa0ee8c.mp4","/_next/static/images/benchbot_web-0b702a0b6abfa7d90459998e6fa0ee8c.webp","/_next/static/images/benchbot_web-0b702a0b6abfa7d90459998e6fa0ee8c.jpg"],"primaryText":"BenchBot Software Stack","secondaryText":"qcr/benchbot","secondaryTransform":"lowercase"},{"linkUrl":"/code/delta_descriptors_code","mediaPosition":"center","mediaUrls":["/_next/static/images/ral-iros-2020-delta-descriptors-schematic-b5f57732c327f2f8546715b5dc3643af.png.webp","/_next/static/images/ral-iros-2020-delta-descriptors-schematic-95f5d1a50f3d92aa3344d9782ac13c32.png"],"primaryText":"Delta Descriptors","secondaryText":"oravus/DeltaDescriptors","secondaryTransform":"lowercase"},{"linkUrl":"/code/pgraph-python","mediaPosition":"center","mediaUrls":["/_next/static/images/roads-8b68dd7b635af6f867a02be9d399b4bd.png.webp","/_next/static/images/roads-18739c10c6cf2a6dccbffb581fb9a183.png"],"primaryText":"Graph classes (Python)","secondaryText":"petercorke/pgraph-python","secondaryTransform":"lowercase"},{"linkUrl":"/code/gtsam-quadrics","mediaPosition":"center","mediaUrls":["/_next/static/images/gtsam_quadrics-9ce945399d611f449b8df8e1db6602ae.png.webp","/_next/static/images/gtsam_quadrics-cb27c37d5d64abed2e30e1523a8cec1a.png"],"primaryText":"GTSAM extension for 
quadrics","secondaryText":"qcr/gtsam-quadrics","secondaryTransform":"lowercase"},{"linkUrl":"/code/heaputil_code","mediaPosition":"center","mediaUrls":["/_next/static/images/overview-8c193585e23714439d55f0227d88f923.jpg.webp","/_next/static/images/overview-fc609d6102a3c08cb20b14382e57ee50.jpg"],"primaryText":"HEAPUtil","secondaryText":"Nik-V9/HEAPUtil","secondaryTransform":"lowercase"},{"linkUrl":"/code/lost_code","mediaPosition":"center","mediaUrls":["/_next/static/images/day-night-keypoint-correspondence-place-recognition-38203057bf036a1e9271b0a7647119fa.jpg.webp","/_next/static/images/day-night-keypoint-correspondence-place-recognition-bed6f778b7ec1ce4edaa346e24fb33bf.jpg"],"primaryText":"LoST-X","secondaryText":"oravus/lostX","secondaryTransform":"lowercase"},{"linkUrl":"/code/openseqslam2_code","mediaPosition":"center","mediaUrls":["/_next/static/images/openseqslam2-c5079d59d4cff5bd652acb1652d047f6.png.webp","/_next/static/images/openseqslam2-f3755fc8e61c0d81c8f0b0f42c5e08ae.png"],"primaryText":"OpenSeqSLAM2","secondaryText":"qcr/openseqslam2","secondaryTransform":"lowercase"},{"linkUrl":"/code/patchnetvlad_code","mediaPosition":"center","mediaUrls":["/_next/static/images/patch_netvlad_method_diagram-a9187148aad4ff631ce8f55f695459ec.png.webp","/_next/static/images/patch_netvlad_method_diagram-26dab363c927eaf0c0020decf330646e.png"],"primaryText":"Patch-NetVLAD","secondaryText":"QVPR/Patch-NetVLAD","secondaryTransform":"lowercase"},{"linkUrl":"/code/topometric_localization","mediaPosition":"center","mediaUrls":["/qcr_logo_light_filled.svg"],"primaryText":"Place-aware Topometric Localization","secondaryText":"mingu6/TopometricLoc","secondaryTransform":"lowercase"},{"linkUrl":"/code/pdq","mediaFit":"contain","mediaPosition":"center","mediaUrls":["/_next/static/images/qcr_web_img-c5a515adb03792ab295e52f405822b65.jpg.webp","/_next/static/images/qcr_web_img-8b73fea58e143ca4e51ab20579b08efa.jpg"],"primaryText":"Probability-based Detection Quality (PDQ)","secondaryText":"david2611/pdq_evaluation","secondaryTransform":"lowercase"},{"linkUrl":"/code/code-templates","mediaPosition":"center","mediaUrls":["/_next/static/images/demo-70a6816faa1e78bf2b6f4c8115a1a047.webm","/_next/static/images/demo-70a6816faa1e78bf2b6f4c8115a1a047.mp4","/_next/static/images/demo-70a6816faa1e78bf2b6f4c8115a1a047.webp","/_next/static/images/demo-70a6816faa1e78bf2b6f4c8115a1a047.jpg"],"primaryText":"QCR's Code Templates","secondaryText":"qcr/code_templates","secondaryTransform":"lowercase"},{"linkUrl":"/code/quadricslam","mediaPosition":"center","mediaUrls":["/_next/static/images/quadricslam_video-412d8ad8190b4f7eee1320faf254cd6f.png.webp","/_next/static/images/quadricslam_video-a4d673ea6414754e153004c137d2a2c1.png"],"primaryText":"QuadricSLAM","secondaryText":"qcr/quadricslam","secondaryTransform":"lowercase"},{"linkUrl":"/code/robotics-toolbox-python","mediaFit":"contain","mediaPosition":"center","mediaUrls":["/_next/static/images/RobToolBox_RoundLogoB-fd4fa9f238808ea84fa7ed15c039c58c.png.webp","/_next/static/images/RobToolBox_RoundLogoB-dd66a766d39b1761d4fba8db5bb28020.png"],"primaryText":"Robotics Toolbox Python","secondaryText":"petercorke/robotics-toolbox-python","secondaryTransform":"lowercase"},{"linkUrl":"/code/ros-omron-driver","mediaPosition":"center","mediaUrls":["/_next/static/images/omron_robot-6882a84f2dec840b5cba11e9f8f19e65.jpg.webp","/_next/static/images/omron_robot-542517e40cecf88333a4f6e07f854cc1.jpg"],"primaryText":"ROS Omron 
Driver","secondaryText":"qcr/ros_omron_driver","secondaryTransform":"lowercase"},{"linkUrl":"/code/rt_bene_code","mediaFit":"contain","mediaPosition":"center","mediaUrls":["/_next/static/images/rt_bene_best_poster_award-5ac70111852de9eac6c94cd88ef726e0.png.webp","/_next/static/images/rt_bene_best_poster_award-d72f84610eb0050287dd856b52cc99c5.png"],"primaryText":"RT-BENE: Real-Time Blink Estimation in Natural Environments Codebase","secondaryText":"Tobias-Fischer/rt_gene","secondaryTransform":"lowercase"},{"linkUrl":"/code/rt_gene_code","mediaFit":"contain","mediaPosition":"center","mediaUrls":["/_next/static/images/system_overview-e905413b7b8a569c769b893296ea5aa3.jpg.webp","/_next/static/images/system_overview-f550cd56b0872bdc54bc11c36db2eaf5.jpg"],"primaryText":"RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments Codebase","secondaryText":"Tobias-Fischer/rt_gene","secondaryTransform":"lowercase"},{"linkUrl":"/code/seq2single_code","mediaPosition":"center","mediaUrls":["/_next/static/images/illustration-73bec1a3cac56819cdbea1268b711fa4.png.webp","/_next/static/images/illustration-1e185173132d7d8138449660ac905c04.png"],"primaryText":"seq2single","secondaryText":"oravus/seq2single","secondaryTransform":"lowercase"},{"linkUrl":"/code/seqnet_code","mediaPosition":"center","mediaUrls":["/_next/static/images/seqnet-cfc1aecd3cd2b268af41400a4fb86e6a.jpg.webp","/_next/static/images/seqnet-69de71978f2b7f0ffbcefcbb976010d3.jpg"],"primaryText":"SeqNet","secondaryText":"oravus/seqNet","secondaryTransform":"lowercase"},{"linkUrl":"/code/spatialmath-python","mediaFit":"contain","mediaPosition":"center","mediaUrls":["/_next/static/images/CartesianSnakes_LogoW-7d2f987ca5432e1ce32ce72e90be7c64.png.webp","/_next/static/images/CartesianSnakes_LogoW-d72d60a588449aa6a08846bed694c0c9.png"],"primaryText":"Spatialmath Python","secondaryText":"petercorke/spatialmath-python","secondaryTransform":"lowercase"},{"linkUrl":"/code/vpr_snn","mediaPosition":"center","mediaUrls":["/_next/static/images/Ens_of_modularSNNs-b59ff02969917c2eb544fd14a2014936.png.webp","/_next/static/images/Ens_of_modularSNNs-2e12118a078b9b819e6e9169d4994b74.png"],"primaryText":"Spiking Neural Networks for Visual Place Recognition","secondaryText":"QVPR/VPRSNN","secondaryTransform":"lowercase"},{"linkUrl":"/code/swift","mediaPosition":"center","mediaUrls":["/_next/static/images/panda-f1735ad2d702ae9c686b2f0e727e9941.png.webp","/_next/static/images/panda-c3722217e520e43c10f1bc26fffcd0fd.png"],"primaryText":"Swift","secondaryText":"jhavl/swift","secondaryTransform":"lowercase"},{"linkUrl":"/code/event_vpr_code","mediaPosition":"center","mediaUrls":["/_next/static/images/dataset-77ee27292f9a639c3024670f2a9939e2.png.webp","/_next/static/images/dataset-179d4dc0b9d40cbdc11117c78f1d45de.png"],"primaryText":"Visual Place Recognition using Event Cameras","secondaryText":"Tobias-Fischer/ensemble-event-vpr","secondaryTransform":"lowercase"},{"linkUrl":"/code/teach_repeat","mediaPosition":"center","mediaUrls":["/_next/static/images/outdoor-run-c6d0f9054f19ca3ca4a9c32ae5089b50.webm","/_next/static/images/outdoor-run-c6d0f9054f19ca3ca4a9c32ae5089b50.mp4","/_next/static/images/outdoor-run-c6d0f9054f19ca3ca4a9c32ae5089b50.webp","/_next/static/images/outdoor-run-c6d0f9054f19ca3ca4a9c32ae5089b50.jpg"],"primaryText":"Visual Teach and 
Repeat","secondaryText":"QVPR/teach-repeat","secondaryTransform":"lowercase"},{"linkUrl":"/code/vprbench","mediaPosition":"center","mediaUrls":["/_next/static/images/VPRBench-a4fbe919a2ac5fc851261353f3fbdd9a.jpg.webp","/_next/static/images/VPRBench-5db45a25afa26692b0958cbf579b9a77.jpg"],"primaryText":"VPR-Bench","secondaryText":"MubarizZaffar/VPR-Bench","secondaryTransform":"lowercase"}],"title":"Codebases on GitHub"},"__N_SSG":true} |
@@ -0,0 +1 @@
{"pageProps":{"codeData":{"content":"<p align=\"center\"><strong>~ Please see the <a href=\"https://btalb.github.io/abstract_map/\">abstract map site</a> for further details about the research publication ~</strong></p>\n<h1>App for the Human vs Abstract Map Zoo Experiments</h1>\n<p><video autoplay=\"\" muted=\"\" loop=\"\" poster=\"/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webp\"><source src=\"/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webm\" type=\"video/webm\"><source src=\"/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.mp4\" type=\"video/mp4\">A demo of the app human participants used</video></p>\n<p>This repository contains the mobile application used by human participants in the zoo experiments described in our <a href=\"https://doi.org/10.1109/TCDS.2020.2993855\">IEEE TCDS journal</a>. The app, created with Android Studio, includes the following:</p>\n<ul>\n<li>opening screen for users to select experiment name & goal location</li>\n<li>live display of the camera to help users correctly capture a tag</li>\n<li>instant visual feedback when a tag is detected, with colouring to denote whether symbolic spatial information is not the goal (red), navigation information (orange), or the goal (green)</li>\n<li>experiment definitions & tag mappings are creatable via the same XML style used in the <a href=\"https://github.com/btalb/abstract_map\">abstract_map</a> package</li>\n<li>integration with the <a href=\"https://github.com/AprilRobotics/apriltag\">native C AprilTags</a> using the Android NDK</li>\n</ul>\n<h2>Developing & producing the app</h2>\n<p>The project should be directly openable using Android Studio.</p>\n<p>Please keep in mind that this app was last developed in 2019, and Android Studio often introduces minor breaking changes with new versions. Often you will have to tweak things like Gradle versions / syntax etc. to get a project working with newer versions. Android Studio is very good though with pointing out where it sees errors and offering suggestions for how to resolve them.</p>\n<p>Once you have the project open, you should be able to compile the app and load it directly onto a device without issues.</p>\n<h2>Acknowledgements & Citing our work</h2>\n<p>This work was supported by the Australian Research Council's Discovery Projects Funding Scheme under Project DP140103216. The authors are with the <a href=\"https://research.qut.edu.au/qcr/\">QUT Centre for Robotics</a>.</p>\n<p>If you use this software in your research, or for comparisons, please kindly cite our work:</p>\n<pre><code>@ARTICLE{9091567, \n author={B. {Talbot} and F. {Dayoub} and P. {Corke} and G. 
{Wyeth}}, \n journal={IEEE Transactions on Cognitive and Developmental Systems}, \n title={Robot Navigation in Unseen Spaces using an Abstract Map}, \n year={2020}, \n volume={}, \n number={}, \n pages={1-1},\n keywords={Navigation;Robot sensing systems;Measurement;Linguistics;Visualization;symbol grounding;symbolic spatial information;abstract map;navigation;cognitive robotics;intelligent robots.},\n doi={10.1109/TCDS.2020.2993855},\n ISSN={2379-8939},\n month={},}\n}\n</code></pre>\n","name":"Android App for Human Participants","type":"code","url":"https://github.com/btalb/abstract_map_app","image":"./docs/abstract_map_app.gif","_images":["/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webm","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.mp4","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.webp","/_next/static/images/abstract_map_app-d75baca0c5b7f59d88c7db6b1dff9e4d.jpg"],"src":"/content/human_cues/abstract-map-app.md","id":"abstract-map-app","image_position":"center"}},"__N_SSG":true} |
_next/data/xiGQL6hq6_ZYPceHa4eHY/code/abstract-map-simulator.json
1 change: 1 addition & 0 deletions
@@ -0,0 +1 @@
{"pageProps":{"codeData":{"content":"<p align=\"center\"><strong>~ Please see the <a href=\"https://btalb.github.io/abstract_map/\">abstract map site</a> for further details about the research publication ~</strong></p>\n<h1>Using the Abstract Map in a 2D Stage simulation</h1>\n<p><picture><img alt=\"2D Stage simulation with with simulated tags\" src=\"https:/github.com/btalb/abstract_map_simulator/raw/HEAD/docs/abstract_map_simulation.png\"></picture></p>\n<p>Package contains everything needed to simulate the zoo experiments performed in our <a href=\"https://doi.org/10.1109/TCDS.2020.2993855\">IEEE TCDS journal</a>. The package includes:</p>\n<ul>\n<li>world & launch files for a stage simulation of the GP-S11 environment on QUT's Gardens Point campus</li>\n<li>a tool for creating simulated tags in an environment & saving them to file,</li>\n<li>launch & config files for using the move_base navigation stack with gmapping to explore unseen simulated environments</li>\n</ul>\n<h2>Installing the abstract map simulator</h2>\n<p><em>Note: this is just the simulator; to use the abstract map with the simulator please make sure you use the <a href=\"https://github.com/btalb/abstract_map\">abstract_map</a> package</em></p>\n<p>Clone the repo & install all Python dependencies:</p>\n<pre><code>git clone https://github.com/btalb/abstract_map_simulator\npip install -r abstract_map_simulator/requirements.txt\n</code></pre>\n<p>Add the new package to your ROS workspace at <code><ROS_WS>/</code> by linking in the cloned repository:</p>\n<pre><code>ln -s <LOCATION_REPO_WAS_CLONED_ABOVE> <ROS_WS>/src/\n</code></pre>\n<p>Install all of the listed ROS dependencies, and build the package:</p>\n<pre><code>cd <ROS_WS>/src/\nrosdep install abstract_map_simulator\ncd <ROS_WS>\ncatkin_make\n</code></pre>\n<h2>Acknowledgements & Citing our work</h2>\n<p>This work was supported by the Australian Research Council's Discovery Projects Funding Scheme under Project DP140103216. The authors are with the <a href=\"https://research.qut.edu.au/qcr/\">QUT Centre for Robotics</a>.</p>\n<p>If you use this software in your research, or for comparisons, please kindly cite our work:</p>\n<pre><code>@ARTICLE{9091567, \n author={B. {Talbot} and F. {Dayoub} and P. {Corke} and G. {Wyeth}}, \n journal={IEEE Transactions on Cognitive and Developmental Systems}, \n title={Robot Navigation in Unseen Spaces using an Abstract Map}, \n year={2020}, \n volume={}, \n number={}, \n pages={1-1},\n keywords={Navigation;Robot sensing systems;Measurement;Linguistics;Visualization;symbol grounding;symbolic spatial information;abstract map;navigation;cognitive robotics;intelligent robots.},\n doi={10.1109/TCDS.2020.2993855},\n ISSN={2379-8939},\n month={},}\n}\n</code></pre>\n","name":"2D Simulator for Zoo Experiments","type":"code","url":"https://github.com/btalb/abstract_map_simulator","image":"./docs/abstract_map_simulation.png","_images":["/_next/static/images/abstract_map_simulation-55e32b58dd5e4ed9caf7a85baf98677c.png.webp","/_next/static/images/abstract_map_simulation-3a9dbfc04fa16e80a961cec841d316fc.png"],"src":"/content/human_cues/abstract-map-simulator.md","id":"abstract-map-simulator","image_position":"center"}},"__N_SSG":true} |