This file contains only basic metadata and links to where the website can be accessed. As a result, the licence under which this file is shared on OpenAIR is not necessarily the same as the licence used for the website content itself. Please consult the terms and conditions of use for the website directly.

GENERAL INFORMATION

1. Title of Dataset: Video for pick&place study in AFRC project. [Dataset]

2. Contributor information:
   Yijun Yan (University of Strathclyde)
   Jaime Zabalza (University of Strathclyde)
   Carmelo Mineo (University of Strathclyde)
   Erfu Yang (University of Strathclyde)
   Zixiang Fei (University of Strathclyde)
   Cuebong Wong (University of Strathclyde)

3. Date of data collection: 2018-03-31

4. Funding: University of Strathclyde [Advanced Forming Research Centre (AFRC) under its Route to Impact Funding Programme 2017-2018]

ACCESS INFORMATION

1. License: https://creativecommons.org/licenses/by/4.0/

2. Links to publications that cite or use the data:
   https://doi.org/10.3390/s19061354
   https://tinyurl.com/y5phtv58

3. Recommended citation: YAN, Y., ZABALZA, J., MINEO, C., YANG, E., FEI, Z. and WONG, C. 2019. Video for pick&place study in AFRC project. [Dataset]. Hosted on Pureportal (University of Strathclyde) [online]. Available from: https://doi.org/10.15129/4df22803-2cc4-4cce-9cea-509f88f1b504

DATA AND FILE OVERVIEW

1. File list:
   Pick_place1 (MP4) - Video clip showing the robot adapting to the sudden appearance of an obstacle (16 seconds).
   Pick_place2 (MP4) - Video clip showing the robot adapting to the sudden appearance of an obstacle (13 seconds).

METHODOLOGICAL INFORMATION

1. Summary: The video data was created as a demonstration of the pick-and-place study in the AFRC (Advanced Forming Research Centre) project. It shows the robot adapting to the sudden appearance of an obstacle.

2.
Detailed methodology: To extend the abilities of current robots in industrial applications towards more autonomous and flexible manufacturing, this work presents an integrated system comprising real-time sensing, path planning and control of industrial robots, providing them with adaptive reasoning, autonomous thinking and environment interaction under dynamic and challenging conditions. The developed system consists of an intelligent motion planner for a 6-degrees-of-freedom robotic manipulator, which performs pick-and-place tasks according to an optimized path computed in real time while avoiding a moving obstacle in the workspace. The moving obstacle is tracked by a machine-vision-based sensing strategy that performs color detection in the hue, saturation, value (HSV) space, in order to cope with changing conditions including a non-uniform background, lighting reflections and projected shadows. The machine vision is implemented as an off-board scheme with two low-cost cameras, where the second camera solves the problem of vision obstruction when the robot invades the field of view of the main sensor. Real-time performance of the overall system has been experimentally tested using a KUKA KR90 R3100 robot. The vision module, based on low-cost cameras and HSV color detection, makes the robot aware of its changing environment by detecting and localizing a randomly moving obstacle. Path correction to avoid collisions between the robotic manipulator and such obstacles is achieved by an adaptive path-planning module together with a dedicated robot-control module, with the three modules (vision, planning and control) running simultaneously.
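To illustrate the HSV color-detection step described above, the sketch below thresholds an image in HSV space and localizes a colored obstacle by its pixel centroid. This is a minimal pure-Python sketch, not the project's actual implementation: the blue obstacle color, the hue/saturation/value thresholds and the helper names are all assumptions chosen for illustration.

```python
import colorsys

def hsv_mask(image, h_lo, h_hi, s_min=0.5, v_min=0.3):
    """Binary mask of pixels whose hue lies in [h_lo, h_hi].

    `image` is a list of rows of (r, g, b) tuples with channels in [0, 1].
    Thresholding on hue with minimum saturation/value bounds makes the
    detection more robust to lighting changes, since brightness variations
    mostly affect V rather than H.
    """
    mask = []
    for row in image:
        mask_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            hit = h_lo <= h <= h_hi and s >= s_min and v >= v_min
            mask_row.append(1 if hit else 0)
        mask.append(mask_row)
    return mask

def centroid(mask):
    """Pixel-coordinate centroid (row, col) of the mask, or None if empty."""
    pts = [(i, j) for i, row in enumerate(mask)
                  for j, m in enumerate(row) if m]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

# Toy 3x3 frame: grey background with one blue "obstacle" pixel at (1, 1).
frame = [[(0.5, 0.5, 0.5)] * 3 for _ in range(3)]
frame[1][1] = (0.1, 0.1, 0.9)  # blue: hue is roughly 0.66

mask = hsv_mask(frame, h_lo=0.55, h_hi=0.75)
print(centroid(mask))  # -> (1.0, 1.0)
```

The grey background is rejected by the saturation bound rather than the hue bound, which is the property that lets this kind of detector tolerate shadows and reflections on an otherwise non-uniform background.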
These sensing and reasoning capabilities allow smooth interaction between the robot and its dynamic environment, where the robot reacts to dynamic changes through autonomous thinking and reasoning, with reaction times below the average human reaction time. The experimental results demonstrate that effective human-robot and robot-robot interactions can be realized through the innovative integration of emerging sensing techniques, efficient planning algorithms and systematic designs.