Speaker 1: This is a complete accuracy assessment of the iPhone 15 Pro's camera and LiDAR sensors in the context of geospatial mapping, in comparison to a high-accuracy surveying total station. Okay, and we're going to start collecting data. As you can see in the live view, this is the LiDAR point cloud. Of course, imagery is being taken at the same time, and those images can later be processed to generate a photogrammetric point cloud. Right now, what you're seeing is the live LiDAR point cloud being generated using the iPhone 15 Pro. I'm going to go inside here and try to get as much of the inside of the island as well. We're going to get the fire hydrant, okay, and let's stop it here, alright. So what we have here is the trajectory of our iPhone 15 Pro, this is the LiDAR point cloud that we've generated, and these right here are all of the images taken: 444 calibrated camera positions from which we can generate a photogrammetric point cloud and a 3D model. Now let's survey this with a total station. For the control points, I'm going to be setting two stakes here using a GNSS receiver. Both of these stakes have a nail in the center to ensure that we are identifying the center of the stake every time we observe it with the GNSS receiver or the total station. I'm going to set the first stake right here, and then I'm going to come to this side of my yard, where I'll be setting a second stake for my backsight. Alright, so I've got my GNSS receiver here, I'm using the Leica GS18i, and the reason I'm using this GNSS receiver is to establish geodetic coordinates for where the total station is going to be, as well as for the backsight reading. I'll name this job iPhone 15 Pro. Now for my antenna height, I am sitting at 1.8 meters, which is equivalent to 5.905 feet. We're going to go ahead and start our RTK stream, and now we're getting RTK corrections. So now let's observe both of the points.
We'll start with our backsight point, and this will be point number one, and we'll measure. Cool. Now let's shoot the second point. I really don't feel like moving these tripod legs.
Speaker 2: Alright, here we go.
Speaker 1: Using the IMU, I can use tilt compensation to hold the rod like this and still get an accurate reading. If you don't believe me, then check out this video in the corner. And measure.
Speaker 2: Cool.
Speaker 1: Alright, let's set up our total station. Now this total station is going to provide us with the highest accuracy in terms of relative distances by utilizing angles and distances to calculate coordinates, and by having geodetic coordinates on our control points, we're going to be able to calculate geodetic coordinates for all of the measurements we take, so that we can measure the absolute accuracy of the iPhone 15 Pro. Okay, now that I have my total station set up, I'm going to back out and switch over to TS only. I'm going to set our backsight here. The height of our prism is 1.741 meters, which is equivalent to 5.712 feet. So now I can go over to set up with a known backsight. This right here is actually point number two. And our instrument height, 4.64, okay? We are backsighting point number one, and we have a target height of 5.712. Take a distance. Okay, we are within one hundredth of a foot, so I'm pretty happy with that, and I'm going to go ahead and say store. And we're going to do a 180-degree flip. Same thing, we're within a hundredth, so I'm going to say store and set our angle. By doubling our angle, we're able to see our position at both a 90-degree vertical angle and a 270-degree vertical angle. So now, just like we did with the iPhone 15 Pro, I'm going to be surveying the island over here on the cul-de-sac so that we can validate the accuracy of the mapping capabilities when it comes to feature extraction. Alright, ready? Let's start collecting data.
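The "angles and distances to calculate coordinates" step above is just coordinate geometry. As a minimal sketch, here is how one total-station observation reduces to northing, easting, and elevation; the function and variable names are made up for illustration (this is not any field software's API), and units only need to be consistent (feet here).

```python
import math

def polar_to_coords(station_n, station_e, station_z,
                    azimuth_deg, zenith_deg, slope_dist,
                    instrument_h, target_h):
    """Reduce one angle-and-distance observation to N, E, Z.

    azimuth_deg: horizontal angle from north, clockwise.
    zenith_deg: vertical circle reading (90 = level telescope).
    All distances are in the same unit (feet here).
    """
    az = math.radians(azimuth_deg)
    zen = math.radians(zenith_deg)
    h_dist = slope_dist * math.sin(zen)   # horizontal component
    v_dist = slope_dist * math.cos(zen)   # vertical component
    n = station_n + h_dist * math.cos(az)
    e = station_e + h_dist * math.sin(az)
    z = station_z + instrument_h + v_dist - target_h
    return n, e, z

# Example: a level shot 50 ft due east of the setup, using the instrument
# and target heights mentioned in the video (4.64 ft and 5.712 ft).
n, e, z = polar_to_coords(1000.0, 2000.0, 100.0, 90.0, 90.0, 50.0, 4.64, 5.712)
```

With geodetic coordinates on the setup and backsight control points, every shot reduced this way inherits geodetic coordinates too, which is what makes the absolute-accuracy comparison possible.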
Speaker 2: Point store.
Speaker 1: Two very boring minutes later. And one last reflectorless shot on the tree. And store. Point store. Taking a look here, we can see all of the points that we collected. We basically did curb and gutter all the way around the island, plus five ground shots as well as the top of the hydrant and the tree. Stick around to the end of the video to see the accuracy of the iPhone 15 Pro when it comes to feature extraction. Now I want to run a linear accuracy test using the iPhone 15 Pro: I want to see how much the accuracy drifts while using this iPhone, and over how much distance. This is reflective tape that I bought from Home Depot. The reason I'm using it is that when I place it on the road, I can see it in the LiDAR data in intensity mode, so I can validate the accuracy of the LiDAR point cloud just like I can with the imagery in the photogrammetry point cloud. Now I'm going to make a chevron marker like this one every 10 feet for 100 feet, so we'll have 11 markers in total. All right, let's get started. Okay, there's one. Okay, there's two.
Speaker 2: A little longer than a few minutes later.
Speaker 1: Okay, finally I got all of these points set 100 feet down, and now let's measure their positions using the total station.
Speaker 2: Point stored. Point stored. Point stored.
Speaker 1: All right, now let's scan it with our iPhone. Okay, and we are going to stand like this and start collecting data. Nice and easy, and we are at 100 feet. Done. Okay, and we have our linear test right here with almost 200 images, so this looks good. And now let's take all of this data and head inside so that we can process it and see how accurate the iPhone 15 Pro is. Hello, and welcome to my hotel room in Berlin, Germany. I filmed this video a few weeks ago, and I was supposed to finish it and upload it before coming to Intergeo. Of course, I got a little busy and did not have the time to finish, and I'm sure everyone is anxious to see the results, so here you go. I started by processing the images that we captured with the iPhone 15 Pro and combined that data with the LiDAR data to get a fused point cloud. I did all of this in PIX4Dmatic, and here's how it turned out. For the island survey, it actually turned out pretty nice. I can see plenty of detail in the curb and gutter, and a pretty seamless integration between both data sets. The reconstruction of the fire hydrant was not bad; I do see a little bit of noise on the outside of the hydrant, but I can clearly see the top of it if I wanted to assign coordinates for a benchmark. Now, we took that model from PIX4Dmatic and brought it into PIX4Dsurvey. The reason I want to do this is that I want to create a surface, and I don't want to use the millions of points that are in the point cloud, so I'm going to simplify this point cloud by using grid points. Out of the millions of points in the fused point cloud, I generated about 1,500 points in a grid, and that grid looks something like this. All these little blue plus signs are points on the ground. Notice that there are no points on the fire hydrant, the tree, or this little pot. That's because we only want terrain points in order to generate our surface.
I also used a low-pass filter to ensure that I'm grabbing the lowest points in the point cloud, so I'm not picking up any vegetation. Once we generated the grid points, I was able to create a surface, and here's what the surface looks like. This is the surface that I'm going to compare against a surface generated using a surveying total station. Now, the points that I collected with the total station I brought into AutoCAD Civil 3D, and I created a standard CAD drawing. Right here are all of the points that I brought in. Control point 2 is where the total station was set up, and control point 1 was where the backsight was. These are the points from the linear test with the LiDAR targets; we'll get to that in just a minute. Over here to the right is the island. I've got all of the points at back of curb and gutter with breaklines put in, as well as a couple of ground shots, the fire hydrant, and a tree. If I click here and go to object viewer, this is the surface that was generated with the total station. You can see the lip between the curb and gutter, and it does go up to a high point where this contour line shows us. The thing is, with the total station, the information that we can bring into CAD is very dependent on how much data we collect in the field. Of course, I'm not going to collect millions of points like in a point cloud, or even 1,500 points like I did with the grid. I'm only going to collect a handful of points, and I think in this example I collected just under 50. So with 50 points, this is the type of model I was able to create, and this is pretty standard in the industry when you use a total station. Now, let's import that surface we created in PIX4Dsurvey from our iPhone 15 Pro. Okay, and here is the surface of the iPhone 15 Pro. And obviously, the GPS positioning of the iPhone is not that great. It's several feet off.
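The grid-points-plus-low-pass-filter step described above can be sketched as a keep-the-lowest-point-per-cell pass over the cloud. This is a crude illustrative re-implementation of the idea, not Pix4D's actual algorithm; `lowpass_grid` and its parameters are invented for the example.

```python
import numpy as np

def lowpass_grid(points, cell=2.0):
    """Keep only the lowest point inside each cell x cell grid square.

    A simple stand-in for grid-point thinning with a low-pass filter:
    millions of points become a sparse terrain grid, and high returns
    such as vegetation are rejected because only the lowest Z survives.

    points: (N, 3) array-like of X, Y, Z coordinates.
    """
    pts = np.asarray(points, dtype=float)
    ix = np.floor(pts[:, 0] / cell).astype(int)   # grid column per point
    iy = np.floor(pts[:, 1] / cell).astype(int)   # grid row per point
    lowest = {}                                    # (col, row) -> point index
    for i, key in enumerate(zip(ix, iy)):
        if key not in lowest or pts[i, 2] < pts[lowest[key], 2]:
            lowest[key] = i
    return pts[sorted(lowest.values())]

# Two points share a cell; the higher return (e.g. a bush at z=2.0)
# is dropped, and only ground-level points remain.
thinned = lowpass_grid([[0.5, 0.5, 2.0], [0.6, 0.6, 1.0], [3.0, 3.0, 5.0]])
```

The surviving points can then be triangulated into a surface, which is what the comparison against the total station surface uses.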
And actually, if I just turn on a location map here (this location map is by no means accurate, but it gives us an idea of where everything should be), this right here is the island, and yeah, obviously, the iPhone is a bit shifted. Now, I just want to show you the sheer amount of complexity there is in this iPhone model in comparison to the total station model. What I'll do is turn on the TIN lines, which are the triangles that connect all of the points together. So I'll select one of the surfaces, go over to the editor, and turn on triangles. And there you go. You can see the network is so much more complex with the iPhone data set in comparison to the total station data set, and that's just because there's a higher sampling of points. Now, does having fewer points mean my data set is going to be less accurate? No. The total station gives more accurate points, just fewer points to work with. The iPhone is collecting tons of data, and we're actually sampling it down a little bit, but that doesn't mean it's more accurate than the total station. Now, I'm going to do my best to align these two data sets together so that I can compare both models and see what the difference is. Okay. And this is, again, my best effort at aligning these two data sets. So I'll first create a total station volume surface. We'll pick the base, and then I can pick the total station surface, and I'll say okay. It says here we've got a fill of 24.47 cubic yards. Then I'll do another one, which I'll call iPhone 15 Pro Volume Base, and this TIN right here is the iPhone surface. I'll say okay, and I've got 26.88 cubic yards. Doing some quick math here: 26.88 minus 24.47 equals 2.41. We'll divide that by the total station volume, 24.47, and multiply by 100.
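Written out as a quick script, the comparison above is just a relative-difference calculation; the two fill volumes are the ones read off the Civil 3D volume surfaces.

```python
# Compare the two fill volumes (cubic yards) from the volume surfaces.
ts_fill = 24.47       # total station surface fill
iphone_fill = 26.88   # iPhone 15 Pro fused surface fill

diff = iphone_fill - ts_fill          # 2.41 cubic yards
pct = diff / ts_fill * 100            # relative to the total station volume

print(f"difference: {diff:.2f} yd^3 ({pct:.1f}% of the total station fill)")
```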
And it looks like we've got about a 10% difference in terms of surface structure between the total station surface and the iPhone surface. Now, mind you, I am only comparing 50 points to these 1,500 points. So if I wanted to get more precise with this, I would take more measurements with the total station to increase the sample size and give us a closer comparison. But 10%, you know, not too bad. We're talking about a $60,000 machine in comparison to a $1,000 smartphone. So not bad for just an iPhone. Now, let's take a look at that linear scan and see how accurate that turned out. I brought the LiDAR point cloud into CloudCompare in order to visualize the point cloud in intensity mode and see our targets. After some manipulation and visualization, I'm able to see these targets beautifully, and I can clearly identify where the corner of the chevron is. This one turned out even better, so I can see that it's right here. I'll go over those results in just a minute, but first, I want to show you the fused point cloud using the imagery with photogrammetry and the LiDAR sensor. Again, in PIX4Dmatic, I've generated this model, and you can see there's a huge correction in the trajectories here after processing everything. So let's see how the positions the algorithm computed compare against the positions we captured with the total station. Now, I did set all of these points as checkpoints, and there's a huge shift between where the total station measured and where the scanned points are. We saw this in the Civil 3D model with the island. When it comes to the absolute accuracy, though, it is quite interesting to see what happened. In the first couple of points that we set, we're seeing errors of 4.5 to 4 feet in the easting, 2.5 to 2 feet in the northing, and between 9 and 7.5 feet in elevation. But the further along we went with the scan, the better the absolute accuracy got. Take a look here when I get down to the last few points.
The easting error got down to 2 feet, and the northing down to 1 foot. The absolute accuracy, at least in the X and Y directions, keeps improving; nowhere in here does it actually get worse. It continues to improve, which suggests that the SLAM algorithm being used while capturing data is actually improving the positional accuracy of the entire data set. This is quite interesting, and it's making me want to do more research to see why this happens. But from what I'm understanding here, the further along we go, the more we scan and the more we survey with the iPhone, the better the accuracy gets, because all of the sensors in this phone are working together to improve the estimated trajectories of the entire system. And when it comes to the LiDAR point cloud, we are seeing the exact same thing in CloudCompare when I compare the coordinates from the point cloud to the total station. For a $1,000 phone, this is the level of accuracy you can expect with an iPhone 15 Pro. If you'd like to learn more about surveying, mapping, and geospatial technology, be sure to subscribe to the YouTube channel. And with that, I will see you guys next time.