
🤖 AGRNav: Efficient and Energy-Saving Autonomous Navigation for Air-Ground Robots in Occlusion-Prone Environments

📢 News

  • [2024/01]: AGRNav is accepted to ICRA 2024.
  • [2023/11]: The code for training SCONet is in another repository.
  • [2023/09]: The 3D model used in the simulation environment can be downloaded from OneDrive.
  • [2023/08]: 🔥 We released the code of AGRNav in the simulation environment.

If you find this work helpful, please show your support by giving us a ⭐️. Your recognition is truly valued.

If you find this work useful in your research, please consider citing:

@INPROCEEDINGS{wang2024agrnav,
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Zhang, Zongyuan and Duan, Tianyang and Huang, Dong and Zhao, Shixiong and Cui, Heming},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)}, 
  title={AGRNav: Efficient and Energy-Saving Autonomous Navigation for Air-Ground Robots in Occlusion-Prone Environments}, 
  year={2024},
  volume={},
  number={},
  pages={11133-11139}
}

🛠️ Installation

The code was tested with Python 3.6.9, PyTorch 1.10.0+cu111, and torchvision 0.11.2+cu111.

Please follow the official PyTorch instructions to install both PyTorch and torchvision. Installing both with CUDA support is strongly recommended.
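
For reference, a pip command along these lines installs the tested versions; the wheel index URL follows the standard PyTorch CUDA 11.1 pattern and is an assumption, so adjust it to match your CUDA setup:

 pip install torch==1.10.0+cu111 torchvision==0.11.2+cu111 -f https://download.pytorch.org/whl/cu111/torch_stable.html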

  1. Clone the repository locally:
 git clone https://github.com/jmwang0117/AGRNav.git
  2. We recommend using Docker to run the project, which reduces the burden of configuring the environment. A Dockerfile is provided in the repository; build the image with the following command:
 docker build . -t skywalker_robot -f Dockerfile
  3. After the image build completes, use our one-click startup script in the same directory (a sketch of a typical startup command is shown after these steps):
 bash create_container.sh
  4. Next, enter the container:
 docker exec -it robot bash
  5. Clone the repository again, this time inside the container:
 git clone https://github.com/jmwang0117/AGRNav.git
  6. Since the point cloud needs to be saved temporarily, please check the paths in the following files:
/root/AGRNav/src/perception/launch/inference.launch

/root/AGRNav/src/perception/SCONet/network/data/SemanticKITTI.py

/root/AGRNav/src/perception/script/pointcloud_listener.py
  7. The SCONet pre-trained model is in the folder below:
/root/AGRNav/src/perception/SCONet/network/weights
  8. If you want to use our 3D AGR model, please download the AGR model into the folder below:
/root/AGRNav/src/uav_simulator/Utils/odom_visualization/meshes

Then modify line 503 of the following file so that it loads AGR.dae (a helper for inspecting this line is shown after these steps):

/root/AGRNav/src/uav_simulator/Utils/odom_visualization/src/odom_visualization.cpp
  9. Run the following commands:
catkin_make
source devel/setup.bash
sh src/run.sh
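
Two optional helpers for the steps above. For step 3, create_container.sh essentially wraps a docker run invocation; the sketch below shows its general shape, assuming the container is named robot (as used by docker exec in step 4) and that GPU access and X11 forwarding are wanted for the simulator. Treat the exact flags as assumptions and prefer the provided script:

 docker run -itd --name robot --network host --gpus all \
     -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix \
     skywalker_robot

For step 8, you can print line 503 before editing it, to confirm you are changing the mesh filename to AGR.dae:

 sed -n '503p' /root/AGRNav/src/uav_simulator/Utils/odom_visualization/src/odom_visualization.cpp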

You have now set up and launched the project successfully. Enjoy!

💽 Dataset

  • SemanticKITTI
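
SCONet is trained and evaluated on SemanticKITTI. The sketch below shows the standard SemanticKITTI directory layout; the exact paths expected by the data loader are an assumption, so check SemanticKITTI.py (listed above) and point its dataset root at your local copy:

 dataset/
   sequences/
     00/
       velodyne/   # raw LiDAR scans (*.bin)
       labels/     # per-point semantic labels (*.label)
       voxels/     # voxelized ground truth used for scene completion
     ...
     21/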

🏆 Acknowledgement

Many thanks to these excellent open source projects:
