{"id":2,"date":"2014-05-14T09:59:48","date_gmt":"2014-05-14T09:59:48","guid":{"rendered":"http:\/\/roboimagedata-webtest.compute.dtu.dk\/?page_id=2"},"modified":"2018-02-19T16:27:42","modified_gmt":"2018-02-19T14:27:42","slug":"sample-page","status":"publish","type":"page","link":"http:\/\/roboimagedata.compute.dtu.dk\/","title":{"rendered":"Overview"},"content":{"rendered":"<p>This homepage is an entry point to the data sets\u00a0generated with our experimental setup shown in one of its configurations in the\u00a0image below. This setup consists of an industrial ABB robot encaged in a black box to combat light pollution (doors are closed during operations)<\/p>\n<div id=\"attachment_18\" style=\"width: 910px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/roboimagedata-webtest.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/RobotImageS2.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-18\" class=\"wp-image-18 size-full\" src=\"http:\/\/roboimagedata.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/RobotImageS2.jpg\" alt=\"An image of our experimental set up for making image data via a robot.\" width=\"900\" height=\"667\" \/><\/a><p id=\"caption-attachment-18\" class=\"wp-caption-text\">An image of our experimental set up for making image data via a robot.<\/p><\/div>\n<p>The general idea behind this setup is, that we can control the camera position and light (note the LED&#8217;s in the box&#8217;s ceiling) from a computer, whereby we can make large amounts of high quality data. In addition to controlling the light and the camera, we have until now also included a structured light scanner, which allows for capturing a reference 3D surface geometry of the viewed scene\/object. 
This is particularly relevant when evaluating image matching, since optical flow or image correspondences can be determined from known camera and scene geometry &#8211; which we therefore provide.<\/p>\n<div id=\"attachment_34\" style=\"width: 594px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/roboimagedata-webtest.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/robotStereo_red.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-34\" class=\"wp-image-34 size-full\" src=\"http:\/\/roboimagedata.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/robotStereo_red.jpg\" alt=\"The structured light stereo head we mounted on the robot arm when compiling our multiple view stereo data set. This head allowed us to capture a structured light scan from every position an image in the data set was taken from.\" width=\"584\" height=\"389\" \/><\/a><p id=\"caption-attachment-34\" class=\"wp-caption-text\">The structured light stereo head we mounted on the robot arm when compiling our multiple view stereo data set. This head allowed us to capture a structured light scan from every position an image in the data set was taken from.<\/p><\/div>\n<h2>Calibration<\/h2>\n<p>While the absolute positioning accuracy of our robot is difficult to control, its repeatability is very high, with only a small stochastic component. This means that running a given positioning script several times yields (almost) identical positions every time.<\/p>\n<p>To address this positioning issue, we do not directly use (or report) the camera positions sent to the robot, but instead determine and report the relative camera positions actually obtained. 
This is done via the <a href=\"http:\/\/www.vision.caltech.edu\/bouguetj\/calib_doc\/\">Camera Calibration Toolbox for Matlab<\/a>.<\/p>\n<div id=\"attachment_47\" style=\"width: 1610px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/roboimagedata-webtest.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/CalImageSample.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-47\" class=\"wp-image-47 size-full\" src=\"http:\/\/roboimagedata.compute.dtu.dk\/wp-content\/uploads\/2014\/05\/CalImageSample.png\" alt=\"An image of a calibration sheet used to determine the internal and external camera parameters.\" width=\"1600\" height=\"1200\" \/><\/a><p id=\"caption-attachment-47\" class=\"wp-caption-text\">An image of a calibration sheet used to determine the internal and external camera parameters.<\/p><\/div>\n<h2>Available data sets<\/h2>\n<p>At present we have two available data sets:<\/p>\n<ul>\n<li>One aimed at <a title=\"Point Feature Data Set \u2013 2010\" href=\"http:\/\/roboimagedata.compute.dtu.dk\/?page_id=24\">evaluating point features<\/a>, made in 2010, with a journal publication appearing in 2012.<\/li>\n<li>One aimed at evaluating <a title=\"MVS Data Set \u2013 2014\" href=\"http:\/\/roboimagedata.compute.dtu.dk\/?page_id=36\">multiple view stereo<\/a>, made in 2013 and published in 2014.<\/li>\n<\/ul>\n<p>These data sets are described further in our <a title=\"Publications\" href=\"http:\/\/roboimagedata.compute.dtu.dk\/?page_id=39\">published papers<\/a> and are freely available as citeware, i.e. 
if you use them, you must cite the related work.<\/p>\n<p>We are currently working on more data sets and plan to publish them here once they are finished and an accompanying publication has been accepted &#8211; so that we receive the necessary academic credit.<\/p>\n<h2>People<\/h2>\n<p>Many people have been involved in making and processing these data sets; however, the two people primarily responsible are <a href=\"http:\/\/www.imm.dtu.dk\/~aanes\/\">Henrik Aan\u00e6s<\/a> and <a href=\"http:\/\/people.compute.dtu.dk\/abda\/\">Anders Dahl<\/a>, both associate professors in the section for <a href=\"http:\/\/www.compute.dtu.dk\/english\/research\/Image\">Image Analysis and Computer Graphics<\/a> at the <a href=\"http:\/\/www.dtu.dk\">Technical University of Denmark (DTU)<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This homepage is an entry point to the data sets generated with our experimental setup, shown in one of its configurations in the image below. 
This setup consists of an industrial ABB robot enclosed in a black box to combat light pollution &hellip; <a href=\"http:\/\/roboimagedata.compute.dtu.dk\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-2","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/pages\/2","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2"}],"version-history":[{"count":33,"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/pages\/2\/revisions"}],"predecessor-version":[{"id":162,"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=\/wp\/v2\/pages\/2\/revisions\/162"}],"wp:attachment":[{"href":"http:\/\/roboimagedata.compute.dtu.dk\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}