Reimagine what's possible with MATLAB
GMT +8 | Singapore time
Designing Intelligent Robots from Land to Air to Sea
Learn & Network with Us
Fred Noto, Industry Manager, MathWorks Inc.
Fred is a Robotics and Autonomous Systems Industry Manager at MathWorks Japan, where he works with customers to realize solutions for complex and advanced robotics systems development. Prior to joining MathWorks, Fred worked at Northrop Grumman in Los Angeles as a Guidance, Navigation & Controls Engineer, where he gained experience with autonomous algorithms and control systems for ground and aerial autonomous systems.
Ian Alferez, Application Engineering Manager, TechSource Systems
Ian is an Application Engineering Manager at TechSource Systems. He specializes in embedded systems, data analytics, and technical computing with MATLAB/Simulink. He has successfully provided technical support to customers in Southeast Asia on robotics and AI projects with MATLAB/Simulink for over 9 years.
Koh Jun Jie, Application Engineer, TechSource Systems
Jun Jie is a Customer Success Engineer at TechSource Systems. He has experience building a UAV tricopter, and subsequently used MATLAB and Simulink for image and signal processing. He has a keen interest in exploring robotics automation and UAV applications across industries.
Claudine Casipit Papag, Application Engineer, TechSource Systems
Claudine is an Application Engineer at TechSource Systems. She specializes in modeling and simulation, verification, code generation, and tool automation with MATLAB/Simulink/Stateflow, as well as cross-software configuration. She has worked on various sensor fusion and navigation applications, leveraging her expertise in autonomous systems with a ROS interface.
Nestor Fernandez, Application Engineer, TechSource Systems
Nestor is an Application Engineer at TechSource Systems, focusing primarily on technical computing. He has worked with multiple clients on automation and deep learning, particularly in computer vision. Among the models he has developed is a defect-detection algorithm that uses thermal images captured by a drone.
- Session 1 June 16 (Closed)
- Session 2 June 23 (Closed)
- Session 3 June 30 (Closed)
- Session 4 July 5 (Closed)
- Session 5 July 7 (Closed)
- Session 6 July 19 (Closed)
- Session 7 July 21 (Closed)
Getting Started with ROS and Integrating with MATLAB & Simulink
In this session, we will introduce the workflows for connecting MATLAB/Simulink and ROS using the ROS Toolbox™. You will learn software features, templates, and best practices to help you implement common ROS programming constructs in MATLAB and Simulink. You will also learn how to combine MATLAB, Simulink, and Stateflow as modeling tools for different types of algorithms within a Simulink block diagram.
At the end, we will demonstrate the concepts above through an autonomous object tracking example. This example implements best practices with MATLAB and Robotics System Toolbox. It also highlights the modularity of MATLAB and ROS by running the algorithm on real and simulated TurtleBot® robotic platforms, as well as a webcam. Two new products were introduced in R2019b to complement the capabilities of Robotics System Toolbox: Navigation Toolbox and ROS Toolbox.
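To give a feel for the connection workflow described above, here is a minimal sketch of linking MATLAB to a ROS network with ROS Toolbox; the master URI and topic names are assumptions for illustration:

```matlab
% Minimal sketch: connect to a ROS network, read a sensor topic,
% and publish a velocity command (ROS Toolbox required).
rosinit('http://192.168.1.1:11311')            % assumed ROS master URI

laserSub = rossubscriber('/scan');             % lidar scans (assumed topic)
velPub   = rospublisher('/cmd_vel', 'geometry_msgs/Twist');

scan = receive(laserSub, 10);                  % wait up to 10 s for a message
msg  = rosmessage(velPub);
msg.Linear.X = 0.2;                            % drive forward slowly
send(velPub, msg);

rosshutdown                                    % disconnect when done
```

The same constructs map onto Simulink through the ROS Toolbox Subscribe and Publish blocks, so an algorithm prototyped this way can be moved into a block diagram with little rework.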
Automated ROS and ROS 2 Node Generation from Prototyping to Production
MATLAB and Simulink support a live connection with a ROS network for your ROS-based desktop simulation. In addition, you can generate standalone C++ nodes from MATLAB and Simulink through automated code generation. Starting with R2021b, you can also embed a deep learning network into ROS nodes using GPU Coder (e.g., generating CUDA code for an object-detection algorithm from a MATLAB script).
This webinar discusses automated C++ ROS/ROS 2 node generation from MATLAB scripts and Simulink models. We will also highlight ROS Toolbox feature updates and propose a workflow that successful users have adopted.
- ROS and ROS 2 node generation and deployment from MATLAB scripts
- ROS and ROS 2 node generation and deployment from Simulink models
- ROS-compatible CUDA node generation from Simulink using GPU Coder
- Working with generated ROS nodes
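As a sketch of the MATLAB-script path above, standalone node generation is driven through a code-generation configuration object; the entry-point function name and deployment settings below are assumptions for illustration:

```matlab
% Minimal sketch: generate a standalone C++ ROS node from a MATLAB
% entry-point function (here a hypothetical myRosNode.m containing
% rossubscriber/rospublisher calls).
cfg = coder.config('exe');
cfg.Hardware = coder.hardware('Robot Operating System (ROS)');
cfg.Hardware.DeployTo    = 'Localhost';        % or 'Remote Device'
cfg.Hardware.BuildAction = 'Build and run';    % build the node and start it

codegen myRosNode -config cfg                  % emits and builds the C++ node
```

For ROS 2, the corresponding hardware configuration ('Robot Operating System 2 (ROS 2)') is selected instead; the rest of the workflow is the same.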
AI for Robotics
Increasing automation in robotics is driving the adoption of advanced AI models. Machine learning and deep learning are finding more and more uses in robotics, in areas such as sensor fusion, perception, planning, and control. Developing robust, accurate deep learning models that can make fast, precise decisions is becoming increasingly important in the industry. This webinar will highlight the MATLAB tools used to develop deep learning models for robotics. We will cover a GUI-based workflow for labelling lidar or video data with automated labelling, repurposing pre-built models for image tasks with minimal coding, developing reinforcement learning models for autonomous robots, and deploying MATLAB code to hardware or simulations.
- GUI-based workflow for labelling data
- Developing models based on pre-trained models
- Maximizing efficiency using GPU accelerated hardware
- Deployment of trained model for simulation and prototyping
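To picture the "pre-trained models" item above, here is a transfer-learning sketch; the dataset folder, class labels, and final-layer names are assumptions for illustration (layer names vary by network):

```matlab
% Minimal sketch: repurpose a pretrained ResNet-18 for a custom
% robotics image dataset (Deep Learning Toolbox required).
imds = imageDatastore('robotParts', ...        % assumed labeled image folder
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');

net    = resnet18;                             % pretrained backbone
lgraph = layerGraph(net);
nCls   = numel(categories(imds.Labels));

% Swap the final layers for the new classes ('fc1000' and
% 'ClassificationLayer_predictions' are ResNet-18's layer names).
lgraph = replaceLayer(lgraph, 'fc1000', ...
    fullyConnectedLayer(nCls, 'Name', 'new_fc'));
lgraph = replaceLayer(lgraph, 'ClassificationLayer_predictions', ...
    classificationLayer('Name', 'new_out'));

% Resize inputs to the network's expected size and train briefly.
augds = augmentedImageDatastore(net.Layers(1).InputSize(1:2), imds);
opts  = trainingOptions('adam', 'MaxEpochs', 5);
trainedNet = trainNetwork(augds, lgraph, opts);
```

Only the replaced layers start from random weights; the rest of the network reuses its pretrained features, which is what keeps the coding and training effort minimal.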
Future Vision for Smart Factories
AI is transforming engineering in nearly every industry and application area. Smart factories leverage AI for various applications such as predictive maintenance and quality inspection. Identifying product defects and reducing manufacturing errors can help reduce labor and manufacturing costs significantly.
- Trending towards Smart Factories
- Collaborative opportunities
- Model Based Design for autonomous systems development
Developing UAV Applications with MATLAB and Simulink
Unmanned Aerial Vehicle (UAV) usage is increasing across a variety of commercial applications such as delivery, agriculture, and construction. Many UAV applications incorporate technologies for autonomous operation, and developing autonomous UAV systems requires multi-domain technologies as well as simulation to ensure proper operation prior to testing on real hardware, reducing risk and rework. Model-Based Design provides an integrated workflow that allows engineers and researchers to develop UAV systems and applications, evaluate them through virtual testing, and deploy to UAV hardware.
- Modeling methodologies for UAV systems
- Designing autonomous UAV algorithms
- Simulating autonomous UAV applications with sensor models
- Automating verification and validation tasks to ensure system robustness
- Automatically generating software and deploying to autopilot and onboard computer hardware
- Analyzing post-flight telemetry data
Fusing Sensor Data to Improve Perception in Autonomous Systems
Autonomous systems continue to be developed and deployed at an accelerated pace across a wide range of applications including automated driving, robotics, and UAVs. The four main components of an autonomous system are sensing, perception, decision making, and action execution and control.
In this webinar, we will focus on the perception component of an autonomous system. We will demonstrate a sensor fusion development framework that includes an integrated test bed, algorithms, and metrics for performance analysis. We will look at the strengths and weaknesses of different sensor modalities for localization and multi-object tracking. We will also explore how measurements from different sensor modalities such as radar and lidar can be combined to improve the quality of the perception system.
Multiple examples will be used to highlight concepts that are critical to architecting and integrating systems that perform sensor fusion.
You will learn:
- Why sensor fusion and tracking are a key part of perception systems
- How to combine measurements taken from different sensors to improve pose estimates and situational awareness
- How to explore system level trade-offs including sensor accuracy, location, and update rates
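As a small illustration of combining complementary sensor measurements, the sketch below fuses accelerometer and gyroscope readings into an orientation estimate with `imufilter` from Sensor Fusion and Tracking Toolbox; the synthetic, stationary IMU data is an assumption for illustration (real data would come from hardware):

```matlab
% Minimal sketch: fuse accelerometer + gyroscope readings into an
% orientation estimate (Sensor Fusion and Tracking Toolbox).
Fs = 100;                                      % sample rate, Hz
N  = 200;                                      % two seconds of data

% Synthetic readings for a stationary, level sensor (illustrative).
accel = repmat([0 0 9.81], N, 1);              % m/s^2
gyro  = zeros(N, 3);                           % rad/s

fuse = imufilter('SampleRate', Fs);
[orientation, angularVelocity] = fuse(accel, gyro);

% Orientation is a quaternion time series; convert to Euler angles.
eul = eulerd(orientation, 'ZYX', 'frame');     % degrees
```

Neither sensor alone gives a drift-free orientation: the gyroscope drifts over time and the accelerometer is noisy; the filter combines the two to get an estimate that is better than either, which is the core idea behind the fusion examples in this session.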
Modeling and Simulation of Autonomous Mobile Robot Algorithms
Mobile robotics deals with modelling and controlling robots with the aim of high-precision implementation. This is why algorithm design, navigation development, early testing, and target-hardware deployment more often than not become a challenge.
In this session, we will share ways to address this by showing how MATLAB and Simulink can help you handle object detection, data processing, and controller design to implement an integrated AMR design workflow.
- Generating occupancy maps and simulation environment from images
- Leveraging out-of-the-box tools to rapidly prototype navigation algorithms
- Integrating external simulators by utilizing a ROS interface
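The first bullet above, generating an occupancy map from an image, can be sketched as follows with Navigation Toolbox; the image filename, threshold, and map resolution are assumptions for illustration:

```matlab
% Minimal sketch: build a binary occupancy map from a floor-plan
% image, then query it (Navigation Toolbox required).
img  = imread('floorplan.png');                % assumed map image
gray = rgb2gray(img);
occ  = gray < 100;                             % dark pixels = occupied

map = binaryOccupancyMap(occ, 10);             % 10 cells per meter
inflate(map, 0.2)                              % pad obstacles by robot radius
free = ~checkOccupancy(map, [1 1]);            % is point [1 1] m free space?
show(map)
```

Inflating the map by the robot's radius lets path planners treat the robot as a point, which is the usual starting point for the out-of-the-box navigation tools covered in this session.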
Do you have burning questions on Robotics & Autonomous Systems or MATLAB? We will answer them one by one, live!
Explore how Robotics & Autonomous Systems are Transforming Engineering
Discover new trends in Robotics Programming, Mobile Robots, UAVs, Sensor Fusion, Smart Factories & Autonomous Systems
Featured Live Demos & Case Studies
Gain knowledge of how to develop robotics and autonomous applications from perception to motion, and optimize system-level behavior
Find Solutions for your Robotics & Autonomous Systems Challenges
Learn as we address common challenges by showing how MATLAB can help to implement an integrated design workflow