Biosystems Engineering (2003) 86(2), 135–144
doi:10.1016/S1537-5110(03)00133-8

AE: Automation and Emerging Technologies

Collision-free Motion Planning for a Cucumber Picking Robot

E.J. Van Henten; J. Hemming; B.A.J. Van Tuijl; J.G. Kornet; J. Bontsema

Department of Greenhouse Engineering, Institute of Agricultural and Environmental Engineering (IMAG B.V.), P.O. Box 43, NL-6700 AA Wageningen, The Netherlands; e-mail of corresponding author: eldert.vanhenten@wur.nl

(Received 26 April 2002; accepted in revised form 8 July 2003; published online 29 August 2003)

One of the most challenging aspects of the development, at the Institute of Agricultural and Environmental Engineering (IMAG B.V.), of an automatic harvesting machine for cucumbers was to achieve a fast and accurate eye-hand co-ordination during the picking operation. This paper presents a procedure developed for the cucumber harvesting robot to pursue this objective. The procedure contains two main components: first, acquisition of sensory information about the working environment of the robot and, secondly, a program to generate collision-free manipulator motions to direct the end-effector to and from the cucumber. This paper elaborates on the latter. Collision-free manipulator motions were generated with a so-called path search algorithm. In this research the A*-search algorithm was used. With some numerical examples the search procedure is illustrated and analysed in view of application to cucumber harvesting. It is concluded that collision-free motions can be calculated for the seven-degrees-of-freedom manipulator used in the cucumber picking device. The A*-search algorithm is easy to implement and robust. This algorithm either produces a solution or stops when a solution cannot be found. This favourable property, however, makes the algorithm prohibitively slow. The results showed that the algorithm does not include much intelligence in the search procedure. It is concluded that, to meet the required 10 s for a single harvest cycle, further research is needed to find fast algorithms that produce solutions using as much information about the particular structure of the problem as possible and give a clear message if such a solution cannot be found.

© 2003 Silsoe Research Institute. All rights reserved. Published by Elsevier Ltd

1. Introduction

In 1996, the Institute of Agricultural and Environmental Engineering (IMAG B.V.) began research on the development of an autonomous cucumber harvesting robot, supported by the Dutch Ministry of Agriculture, Food and Fishery (Gieling et al., 1996; Van Kollenburg-Crisan et al., 1997). The task of designing robots for agricultural applications raises issues not encountered in other industries. The robot has to operate in a highly unstructured environment in which no two scenes are the same. Both crop and fruit are prone to mechanical damage and should be handled with care. The robot has to operate under adverse climatic conditions, such as high relative humidity and temperature as well as changing light conditions. Finally, to be cost effective, the robot needs to meet high performance characteristics in terms of speed and success rate of the picking operation.
In this project, these challenging issues have been tackled by an interdisciplinary approach in which mechanical engineering, sensor technology (computer vision), systems and control engineering, electronics, software engineering, logistics and, last but not least, horticultural engineering partake (Van Kollenburg-Crisan et al., 1997; Bontsema et al., 1999; Meuleman et al., 2000).

One of the most challenging aspects of the development of an automatic harvesting machine is to achieve a fast and accurate eye-hand co-ordination, i.e. an effective interplay between sensory information acquisition and robot motion control during the picking operation, just as people do. In horticultural practice, a trained worker needs only 3–6 s to pick and store a single fruit, and that performance is hard to beat. Fortunately, in terms of picking speed a robotic harvester does not have to achieve such high performance characteristics. A task analysis revealed that, for economic feasibility, a single harvest operation may take up to 10 s (Bontsema et al., 1999). Still, robot motions should be as fast as possible while preventing collisions of the manipulator, end-effector and harvested fruit with the crop, the greenhouse construction and the robot itself (such as the vehicle and vision system). In a Dutch cucumber production facility, the robot operates in a very tight working environment. Finally, to assure the quality of the harvested fruit, constraints have to be imposed on the travelling speed and accelerations of the manipulator during various portions of the motion path.

To achieve the desired eye-hand co-ordination, one needs (real-time) acquisition of sensory information of the environment as well as algorithms to calculate collision-free motions for the manipulator. In this project, the sensory system is based on computer vision, as reported by Meuleman et al. (2000). This paper focuses on the fast generation of collision-free motion trajectories for the manipulator of the picking machine. This issue has not received much attention in agricultural engineering research, despite the fact that considerable research effort has been spent on automatic harvesting of vegetable fruits (see e.g. Kondo et al., 1996; Hayashi & Sakaue, 1996; Arima & Kondo, 1999).

The paper is outlined as follows. In Section 2, the harvesting robot is described. In Section 3, a task sequence for a single harvest operation is presented. Section 4 then presents the components of an automatic algorithm for collision-free motion planning. To obtain insight into the operation of the algorithm, in Section 5 the approach is illustrated on a two-degrees-of-freedom (DOF) manipulator. Section 6 contains results of a motion planning experiment with the six-DOF Mitsubishi RV-E2 manipulator used in the harvest robot. Finally, Section 7 contains concluding remarks and suggestions for future research.

2. The harvesting robot

In Fig. 1 a functional model of the harvesting robot is shown. It consists of an autonomous vehicle used for coarse positioning of the harvesting machine in the aisles of the greenhouse. This vehicle uses the heating pipes as a rail for guidance and support.
The vehicle serves as a mobile platform for carrying power supplies, a pneumatic pump, various electronic hardware for data acquisition and control, a wide-angle camera system for detection and localisation of cucumbers in the crop, and a seven-DOF manipulator for positioning of the end-effector. The manipulator consists of a linear slide on top of which a six-DOF Mitsubishi RV-E2 manipulator is mounted. The RV-E2 manipulator includes an anthropomorphic arm and a spherical wrist. The manipulator has a steady-state accuracy of ±0.2 mm and meets general requirements with respect to hygiene and operation under adverse greenhouse climate conditions (high relative humidity and high temperature). The manipulator carries an end-effector that contains two parts: a gripper to grasp the fruit and a cutting device to separate the fruit from the plant. The end-effector carries a laser ranging system or a small camera. These are used to obtain sensory information for fine motion control of the end-effector in the neighbourhood of the cucumber, if needed.

Fig. 1. A functional model of the cucumber harvest robot: (a) vehicle; (b) wide-angle camera; (c) seven-degrees-of-freedom manipulator; (d) end-effector; (e) laser ranger and position of camera for local imaging; (f) computer and electronics; (g) reel with 220 V power line; (h) pneumatic pump; (i) heating pipe

Notation
c   transition costs between two nodes
f   cost function
g   cost of the motion path from node S to the current node
G   goal node
h   optimistic estimate of the cost to go to the goal from the current node
n   node
n′  successor of node n
S   start node

3. Task sequence of a single harvest operation

A task sequence of a single harvest operation is presented in Fig. 2. Approaching the cucumber during the picking operation is considered to be a two-stage process. First, with the camera system mounted on the vehicle, the cucumber fruit is detected, its ripeness is assessed and its location is determined. In case it is decided to pick the cucumber, the low-resolution images of the vehicle-mounted camera are used for positioning the end-effector in the neighbourhood of the cucumber. Once the end-effector has arrived in the neighbourhood of the cucumber, then, using the laser ranging system or camera system mounted on top of the end-effector, high-resolution information of the local environment of the cucumber is obtained for the final accurate approach of the cucumber. The end-effector grips and cuts the stalk of the fruit. The gripper fixes the detached fruit and finally the harvested fruit is moved to the storage crate.

Obstacle avoidance motion planning will be used both for the initial approach of the cucumber and for the return journey of the harvested cucumber to the crate, to assure that other objects in the work space, such as the robot vehicle itself but also stems and, if present, leaves and parts of the greenhouse construction, are not hit. Clearly, the harvested cucumber increases the size of the end-effector, which should be considered during the return motion of the manipulator to the storage. The average length of a cucumber is 300 mm.

Fig. 2. Task sequence of a single harvest operation: 3D, three-dimensional; TCP, tool-centre-point

4. A collision-free motion planning algorithm

Figure 3 illustrates the components of a program that automatically generates collision-free motions for the cucumber picking robot, based on the work of Herman (1986). Collision-free motion planning relies on three-dimensional (3D) information about the physical structure of the robot as well as the workspace in which the robot has to operate.
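Figure 3 is essentially a pipeline. Purely as an illustration, and assuming hypothetical stage names that do not appear in the paper (the authors implemented their system in MATLAB), the following Python sketch shows how the stages described in the next paragraphs might be chained, with each stage passed in as a callable:

```python
from typing import Callable, Sequence

def plan_harvest_motion(
    acquire_world: Callable[[], dict],                          # 3D world description acquisition
    define_task: Callable[[dict], tuple],                       # choose approach pose of the end-effector
    inverse_kinematics: Callable[[tuple], Sequence[float]],     # pose -> slide translation + six joint angles
    plan_path: Callable[[Sequence[float], Sequence[float], dict], list],   # collision-free path search
    plan_trajectory: Callable[[list], list],                    # add timing (velocity/acceleration limits)
    start_configuration: Sequence[float],
) -> list:
    """Sketch of the Fig. 3 pipeline: world description -> task definition ->
    inverse kinematics -> path planning -> trajectory planning -> implementation."""
    world = acquire_world()
    goal_pose = define_task(world)
    goal_configuration = inverse_kinematics(goal_pose)
    path = plan_path(start_configuration, goal_configuration, world)
    return plan_trajectory(path)        # motion commands handed to the servo controllers
```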
The first step in collision-free robot motion planning is the acquisition of a 3D world description. This description is based on sensory information, such as machine vision, as well as on a priori knowledge about, for instance, the 3D kinematic structure of the harvesting robot, e.g. 3D models contained in a database. With this information, during the task definition phase, the overall task of the robot is planned. It is decided which final position and orientation of the end-effector result in the best approach of the cucumber. Specific position and orientation constraints are also defined during this phase. In the inverse kinematics phase the goal position and orientation of the end-effector, defined during task definition, are translated into the goal configuration of the manipulator. The goal configuration is represented as a combination of a translation of the linear slide and six rotations of the joints of the seven-DOF manipulator. This information is used by the path planner. The path planner employs a search technique to find a collision-free path from the start configuration of the manipulator to its goal configuration. Once the collision-free path planning has been completed successfully, the trajectory planner converts the collision-free path into a trajectory that can be executed by the manipulator. Typically, the path planning process is concerned only with collision-free configurations in space, not with velocity, acceleration and smoothness of motion. The trajectory planner deals with these factors and produces the motion commands for the servomechanisms of the robot. These commands are executed during the implementation phase.

Fig. 3. A program for automatic generation of collision-free motions

Some components of the motion planning system are described hereafter in more detail.

4.1. World description (acquisition)

The machine vision based world description acquisition for the cucumber picking robot is described in a paper by Meuleman et al. (2000). The vision system is able to detect green cucumbers within a green canopy. Moreover, the vision system determines the ripeness of the cucumber. Finally, using stereovision techniques, the camera vision system produces 3D maps of the workspace within the viewing angle of the camera. In this way the robot is able to deal with the variability in the working environment with which it is confronted.

As stated above, a priori knowledge about, for instance, the physical structure of the robot is also needed for collision-free motion planning. As an example, Fig. 4 shows a 3D model of the six-DOF Mitsubishi RV-E2 manipulator implemented in MATLAB. The 3D structure of the manipulator is represented by polygons constructed of rectangles and triangles.

Fig. 4. A three-dimensional model of the six-degrees-of-freedom Mitsubishi RV-E2 manipulator
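The paper does not spell out the data structure behind these polygon models. As a rough illustration only, and not the authors' code, the Python sketch below shows one way a link could be stored as a set of triangles and placed in the workspace with a homogeneous transform; all names and dimensions are assumptions for the example.

```python
import numpy as np

def make_box_link(length, width, height):
    """Axis-aligned box approximating one link, as an (n, 3, 3) array of triangles."""
    x, y, z = length, width, height
    v = np.array([[0, 0, 0], [x, 0, 0], [x, y, 0], [0, y, 0],
                  [0, 0, z], [x, 0, z], [x, y, z], [0, y, z]], dtype=float)
    faces = [(0, 1, 2), (0, 2, 3), (4, 5, 6), (4, 6, 7), (0, 1, 5), (0, 5, 4),
             (2, 3, 7), (2, 7, 6), (1, 2, 6), (1, 6, 5), (0, 3, 7), (0, 7, 4)]
    return v[np.array(faces)]          # each row: the 3 vertices of one triangle

def transform_triangles(triangles, T):
    """Apply a 4x4 homogeneous transform T to an (n, 3, 3) triangle array."""
    pts = triangles.reshape(-1, 3)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])    # homogeneous coordinates
    return (pts_h @ T.T)[:, :3].reshape(triangles.shape)

# Example: a 0.25 m link rotated 30 degrees about the vertical axis and lifted 0.1 m.
theta = np.radians(30.0)
T = np.array([[np.cos(theta), -np.sin(theta), 0, 0.0],
              [np.sin(theta),  np.cos(theta), 0, 0.0],
              [0.0,            0.0,           1, 0.1],
              [0.0,            0.0,           0, 1.0]])
link = transform_triangles(make_box_link(0.25, 0.05, 0.05), T)
print(link.shape)    # (12, 3, 3): 12 triangles, 3 vertices each, 3 coordinates
```

Such triangle sets are convenient because standard triangle-triangle intersection tests can then serve as the collision check referred to below.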
The polygon model of Fig. 4 is used for the evaluation of motion strategies in simulations and serves as a basis for the detection of collisions of the manipulator with structural components in the workspace of the robot during motion planning.

4.2. Inverse kinematics

The inverse manipulator kinematics deal with the computation of the set of joint angles and translations that result in the desired position and orientation of the tool-centre-point (TCP) of the manipulator (Craig, 1989). The TCP is a pre-defined point on the end-effector. For the six-DOF Mitsubishi RV-E2 manipulator, Van Dijk (1999) obtained an analytic solution of the inverse manipulator kinematics. For the seven-DOF manipulator, i.e. the Mitsubishi RV-E2 manipulator mounted on a linear slide, a straightforward analytic solution of the inverse kinematics does not exist, due to the inherent redundancy in the kinematic chain. Recently, a mixed analytic-numerical solution of the inverse kinematics of this redundant manipulator was obtained (Schenk, 2000). Given the position of the ripe cucumber, the algorithm produces a collision-free harvest configuration of the seven-DOF manipulator. It also assures that the joints have sufficient freedom for fine-motion control in the neighbourhood of the cucumber.

4.3. Path planner

Algorithms for collision-free path planning have been the object of much research; see e.g. Latombe (1991) and Hwang and Ahuja (1992) for an overview. A collision-free path planner essentially consists of two important components: a search algorithm and a collision detection algorithm. The search algorithm explores the search space for a feasible, i.e. collision-free, motion from a start point to a goal point. During the search, the feasibility of each step in the search space is checked by the collision detection algorithm. This algorithm checks for collisions of the manipulator with other structural components in the workspace of the robot.

It is important to note that for most path planners the search space is the so-called configuration space of the robot, which crucially differs from the 3D workspace of the robot. In the case of the seven-DOF manipulator of the cucumber harvest machine, the configuration space is a seven-dimensional space spanned by the combination of one joint translation and six joint rotations. The search for a collision-free motion from a start position and orientation of the tool-centre-point to the goal position and orientation in the 3D workspace then boils down to a search for a collision-free motion of a single point through the seven-dimensional configuration space, from the start configuration to the goal configuration. In this way, problems with redundancy in the kinematic chain are easily circumvented. There is a one-to-one mapping of a point in the configuration space to a position and orientation of the tool-centre-point in the workspace. For most manipulators, however, the opposite does not hold: a single position and orientation of the tool-centre-point in the workspace can be reproduced by several configurations of the manipulator. Due to its unique representation, a search in the configuration space is preferred.
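To make the one-to-one versus many-to-one mapping concrete, the short Python sketch below (purely illustrative, not taken from the paper; link lengths and angles are arbitrary) computes the forward kinematics of a planar two-link arm: each configuration gives exactly one TCP position, while the mirrored "elbow-up" and "elbow-down" configurations give the same TCP position.

```python
import numpy as np

def forward_kinematics(q1, q2, l1=0.4, l2=0.3):
    """TCP position of a planar two-link arm for joint angles q1, q2 in rad."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return np.array([x, y])

l1, l2 = 0.4, 0.3
q1, q2 = 0.6, 0.9                      # "elbow-up" configuration
p_up = forward_kinematics(q1, q2)

# The mirrored "elbow-down" configuration reaches the same TCP position.
q2_down = -q2
q1_down = np.arctan2(p_up[1], p_up[0]) - np.arctan2(l2 * np.sin(q2_down),
                                                    l1 + l2 * np.cos(q2_down))
p_down = forward_kinematics(q1_down, q2_down)

print(p_up, p_down)                    # identical TCP positions
print((q1, q2), (q1_down, q2_down))    # two different configurations
```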
For collision detection, on the other hand, one needs to describe the physical posture of the manipulator in relation to the other objects in the 3D workspace. Because every configuration represents a single posture of the manipulator in the 3D workspace, collisions can easily be verified. Then, given the particular kinematic structure of the robot, the workspace obstacles can be mapped into configuration space obstacles, as will be shown later.

4.3.1. The search algorithm

A path search algorithm should be efficient and should find a solution if one exists. The latter property is referred to as completeness (Pearl, 1984). Usually, algorithms that guarantee completeness are not computationally efficient. Computational efficiency, however, is crucial when on-line application is required. To obtain insight into the various aspects of motion planning, in this research completeness of the algorithm was favoured above computational efficiency. The main reason for that choice is that a complete algorithm will either find the solution or stop, using a clearly defined stopping criterion, if a solution cannot be found. This is not true for algorithms that do not assure completeness: they either supply a solution or get stuck without further notice. In this research the so-called A*-search algorithm was used (Pearl, 1984; Kondo, 1991; Russell & Norvig, 1995). It is easy to implement and guarantees completeness. Moreover, it minimises a cost criterion that includes a measure for the distance travelled in the search space. The algorithm was implemented in MATLAB.

To use the A* algorithm for motion planning, the configuration space of the robot is discretised using a fixed grid structure, as shown in Fig. 5. The user can define both the size and the resolution of the grid. The A* algorithm then searches a path from the start grid point to the goal grid point while minimising a cost function f = g + h, in which g is the cost of the path travelled so far and h is an optimistic estimate of the cost from the current position to the goal. In this research, the Euclidean norm was used as an optimistic estimate of the cost to go to the goal node. The A* algorithm is both complete and optimal. Optimality assures that the path obtained minimises the cost function used.

Fig. 5. Orthogonal node expansion in a discretised two-dimensional configuration space: S, start node; G, goal node

The A* algorithm uses two lists of grid nodes, the OPEN list and the CLOSED list. The OPEN list contains grid nodes of which the cost function has not yet been evaluated, whereas the function values of the grid nodes on the CLOSED list have been evaluated. It is assumed that the start and goal configurations coincide with grid nodes, or that grid nodes in the close neighbourhood of these configurations can be selected. Then, according to Pearl (1984), the A* algorithm manipulates the nodes in the grid as follows.

(1) Put the start node S on OPEN.
(2) If OPEN is empty, exit with failure; otherwise remove from OPEN and place on CLOSED a node n for which f is a minimum.
(3) If n is equal to the goal node G, exit successfully with the solution path traced back from n to S.
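The listing below is a compact sketch of a grid-based A* search of the kind described above, written in Python for illustration (the authors' MATLAB implementation is not reproduced here). The occupancy set, the four-connected neighbourhood and the unit transition cost are assumptions made for the example; only the cost function f = g + h with a Euclidean heuristic and the OPEN/CLOSED bookkeeping follow the description in the text.

```python
import heapq
import math

def astar_grid(occupied, start, goal, size=(10, 10)):
    """A*-search on a 2D grid. 'occupied' is a set of blocked (i, j) cells;
    'start' and 'goal' are free cells. Returns a list of cells or None."""
    def h(node):                              # optimistic cost-to-go: Euclidean distance
        return math.dist(node, goal)

    def in_bounds(node):
        return 0 <= node[0] < size[0] and 0 <= node[1] < size[1]

    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # orthogonal node expansion, as in Fig. 5
    open_heap = [(h(start), start)]              # OPEN list, ordered by f = g + h
    g = {start: 0.0}                             # cost of the best path found so far
    parent = {start: None}
    closed = set()                               # CLOSED list: nodes already expanded

    while open_heap:                             # step 2: failure when OPEN is empty
        _, n = heapq.heappop(open_heap)
        if n in closed:
            continue
        closed.add(n)
        if n == goal:                            # step 3: goal reached, trace back the path
            path = []
            while n is not None:
                path.append(n)
                n = parent[n]
            return path[::-1]
        for di, dj in moves:                     # expand n: generate its successors n'
            n2 = (n[0] + di, n[1] + dj)
            if not in_bounds(n2) or n2 in occupied or n2 in closed:
                continue
            c = 1.0                              # transition cost between adjacent nodes
            if n2 not in g or g[n] + c < g[n2]:
                g[n2] = g[n] + c
                parent[n2] = n
                heapq.heappush(open_heap, (g[n2] + h(n2), n2))
    return None                                  # OPEN exhausted: no collision-free path exists

# Small example: a wall of 'obstacle' cells in a 10 x 10 discretised configuration space.
wall = {(3, j) for j in range(0, 5)}
print(astar_grid(wall, start=(0, 0), goal=(6, 2)))
```

For the harvest robot the same idea extends to a seven-dimensional grid, with one slide translation and six joint rotations per node, which is where the cost of this exhaustive node expansion, noted in the abstract as prohibitively slow, becomes apparent.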