What is data sharing in the first place? In general, beyond geoinformatics, it is the moment when two people exchange information with each other. And what is information? It's the mix of conclusions and interpretations we make by processing data. So when you share information, you share data along with the best of your knowledge. Data sharing is a communication process: there is a transmitter and a receiver, and they exchange information through a chosen channel. If you choose that channel poorly, it can result in data loss, or it can take a long time for the data to reach the customer. But that's enough philosophy, let's look at the facts.
So, what does geospatial data sharing mean to us from a business perspective? It means a transaction during which we deliver maps and 3D models to customers, or share information along the subcontractor chain. And this, my dear Reader, is the essence of geoinformatics. If a map or a model can't be interpreted properly, the information transfer won't be successful. Simply handing someone a file does not mean it will be understandable. Usually you must apply some styling to your data to make spatial patterns recognizable, and in other cases the right software is required to visualize the geospatial data at all. The reason we create pretty maps, 3D models, and the related information is to publish them, whether for direct business use or for promotional and marketing purposes. At this point the quality and nature of geospatial data sharing matters a great deal: the surveyor, and the survey itself, will be judged by it. In the following, I will go through the most common geospatial data sharing solutions and share their pros and cons with you (this is information transfer in practice 🙂 ).
Lots of people still use external storage devices. In this category we are talking about USB flash drives, HDDs, and SSDs. Their advantage is that they're cheap and easy to use. I need to highlight that when storing huge files, you must pay attention to the file system. The most common is NTFS (New Technology File System), because most geoinformatics software runs on Windows. Compared to Unix/Linux file systems, NTFS is weaker from a security and organizational point of view, not to mention disk usage, but it's still much better than the FAT and FAT32 file systems, which can't even store a single file larger than 4 GB. If you're interested in file systems, I recommend reading this article. When using hard drives, you must provide an optimal environment for them. You'd better read the user manual, but in a nutshell: keep the drive away from temperature fluctuations and store it in a dry place, protected from light, to avoid data loss. It can also be useful to keep those drives and backups in a separate room, or even in a separate building or premises.
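To make the FAT32 limitation concrete, here is a minimal sketch that checks whether a file would fit on a FAT32-formatted drive at all. The 6 GB point cloud size is an illustrative assumption, not a figure from the text.

```python
# Sketch: check whether a file fits within FAT32's single-file limit
# (4 GiB minus one byte). NTFS, exFAT, and ext4 allow far larger files,
# so drives formatted with those are the safer choice for large deliverables.

FAT32_MAX_FILE_SIZE = 2**32 - 1  # 4 GiB - 1 byte

def fits_on_fat32(size_bytes: int) -> bool:
    """Return True if a file of this size can be stored on a FAT32 volume."""
    return size_bytes <= FAT32_MAX_FILE_SIZE

# Example: a hypothetical 6 GB point cloud will not fit on a FAT32 pendrive.
point_cloud_size = 6 * 10**9
print(fits_on_fat32(point_cloud_size))  # False
```

This is why survey results that include large point clouds or orthophotos routinely fail to copy onto an old FAT32-formatted pendrive, even when there is plenty of free space.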
Let's approach external storage from a practical point of view and imagine how many problems it can cause to deliver your maps or 3D data to a customer this way. First you copy your survey results to the hard drive. What's next? You have to set up a personal meeting, hand it over to a delivery company, or send it by post. Personal meetings are time consuming (and since the COVID-19 pandemic, even harder to organize). In the latter cases you're relying on the delivery company, which is quite risky: the package can get lost in transit, and it can also be damaged. Temperature fluctuations, liquid damage, or physical impact can all lead to data loss, which is something we definitely want to avoid.
Apart from the ceremonial handover of the hard drive, there are other questions to consider. For example, is it worth buying a new hard drive for each project and giving it to the customer? Or do you ask for it back so you can format it and reuse it several times? If that's the plan, I must tell you that formatting does not actually erase the files: a quick format only removes the file system's bookkeeping, and the underlying data often remains recoverable. That's a problem, because we're talking about confidential documents and data. (The only real solution would probably be a drill or a hammer :D)
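To illustrate why simple deletion isn't enough, here is a naive sketch that overwrites a file's bytes before removing it. This is only an illustration of the idea: on SSDs, wear leveling can silently keep old copies of the data, so overwriting from user space is not a guaranteed secure erase.

```python
# Sketch: overwrite a file's contents with zeros before deleting it, so a
# simple delete or quick format doesn't leave the original bytes lying
# around on disk. Illustrative only; not a certified secure-erase method.
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)   # replace the payload with zeros
            f.flush()
            os.fsync(f.fileno())      # push the overwrite to the device
    os.remove(path)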
By using external drives, you can only hand over the raw data. If customers want to open the files, they will need specialized software: GIS software for map data, 3D modeling software for 3D files.
At the end of the day, I wouldn't encourage you to use this method. It's not practical, and you may come across as unprepared and a bit old-school.
There are numerous cloud-based solutions out there: services that let you send large files by email, like JUMBOmail, MailBigFile, and WeTransfer, and file sharing platforms like OneDrive, Google Drive, or Dropbox.
These solutions are a step up from external drives. Choosing the appropriate file system is no longer the user's problem, and the data is kept on a central server from which the customer can download the files at any time. Most cloud services use monthly or annual licensing with different packages, and some are available at a relatively fair price. Reuse (deleting old files and sharing new projects) is not as problematic as it is with external drives.
However, other kinds of issues can still appear in practice. One of them is time. Even though cloud platforms let you replace personal meetings and handovers, uploading and downloading large files can take a lot of time. If the customer's internet connection is slow, downloading a larger file can take hours. Then the customer still needs to install the appropriate software just to open the geospatial data. And if they find something they believe is wrong with your work, you will only hear about it much later, because hours were already spent uploading and downloading the files.
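The "hours to download" claim is easy to sanity-check with a back-of-the-envelope calculation. The file size and connection speed below are illustrative assumptions, not figures from any specific project.

```python
# Sketch: estimate a download time from file size and line speed.
# Connection speeds are quoted in megabits per second, file sizes in
# gigabytes, so we convert: 1 byte = 8 bits.

def download_hours(file_size_gb: float, speed_mbps: float) -> float:
    """Rough transfer time in hours, ignoring protocol overhead."""
    bits = file_size_gb * 8 * 10**9
    seconds = bits / (speed_mbps * 10**6)
    return seconds / 3600

# Example: a 50 GB point cloud on a 20 Mbit/s connection.
print(round(download_hours(50, 20), 1))  # 5.6 (hours)
```

Real transfers are slower still, since protocol overhead and server-side throttling eat into the nominal bandwidth.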
Another concern is security. In this industry we often work with confidential data that must not be seen or used by third parties, and this is hard to guarantee with cloud services. There are rumors about cloud platforms using and trading the files uploaded to them. It's hard to judge whether this is true, but it is certainly worrying if you're working with confidential data.
File Transfer Protocol (FTP) can eliminate both the security concerns of cloud platforms and the physical limits of external storage. With an FTP server, you can grant each customer access to designated folders under a unique account. Depending on the permissions you set, the customer can upload files to and download files from that folder. Each folder can correspond to a single project, so multiple customers can download their data without seeing each other's folders (projects).
This is the most secure way to share your files, especially if you encrypt authentication and transfers with the SSL/TLS protocol. This variant is called FTPS.
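From the customer's side, fetching a deliverable from such a server can be done with Python's standard ftplib. The host name, credentials, folder, and file name below are placeholders, not real endpoints.

```python
# Sketch: downloading a deliverable over FTPS with Python's standard
# ftplib. All connection details here are hypothetical placeholders.
from ftplib import FTP_TLS

def download_from_ftps(host: str, user: str, password: str,
                       folder: str, filename: str) -> None:
    ftps = FTP_TLS(host)
    ftps.login(user, password)   # authenticate over the TLS control channel
    ftps.prot_p()                # switch the data channel to encrypted mode
    ftps.cwd(folder)             # enter the customer's own project folder
    with open(filename, "wb") as f:
        ftps.retrbinary(f"RETR {filename}", f.write)
    ftps.quit()
```

The `prot_p()` call is the important detail: without it, only the login is encrypted while the file contents still travel in plain text.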
After all, this may seem to be the best solution, but it has its own problems. Setting up and operating an FTP server requires significant expertise, and with the wrong settings you will fail to secure your data. From a financial point of view, you will need someone who takes responsibility for configuring and maintaining the server. Considering the wage of a reliable expert plus the server's operating costs, this solution is significantly more expensive than the previous ones.

What costs are incurred when building an FTP server? The computer itself doesn't have to be that powerful, but you'll need at least 1 to 5 TB of storage, because you will be working with large files. The server alone may cost as much as a cloud license would for years. You will also need internet access with enough bandwidth to give customers quick downloads (on the client side). Professionals use several different internet service providers: if one goes down due to maintenance or some other issue, the others remain available and the server keeps working.

Locating the server is also something I could write a lot about. The point is to isolate it from every possible threat: a leaking roof, flammable electrical equipment, and even people. Just think about what happens if anyone can enter the server room. You don't need a hacker attack, just a cleaning lady who, while sweeping and mopping, accidentally pulls a cable out of the computer and forgets to tell you about it, and there you are, waking up in the middle of the night to phone calls from angry customers. On top of all this, you always have to take care of a proper power supply: buy a battery backup against blackouts and current fluctuations. You will also need scheduled backups at the data level, and even at the system level.
I know that operating a server is a complicated process, and it's also quite expensive. Many people can't choose this option, either because they can't afford it or because they lack the required expertise. These are exactly the issues that allowed cloud-based solutions to spread worldwide.
The Way of the Future of Geospatial Data Sharing
What do the previously presented solutions have in common? What exact thing were all of them missing? Direct information transfer, that is, the option of display. If you send files and data to a customer who lacks geoinformatics knowledge and expertise, it's best to deliver them together with a proper display environment. You can't interpret a simple JPG or PNG image without a photo viewer app, and it's the same with all GIS files. Even if you send files to a customer with the appropriate software background, downloading them takes a serious amount of time. In some projects, quick, real-time cooperation is essential, where the customer or the subcontractor can see the partial results and data throughout the process. For example, think how easy it is to work with your colleagues on a Word document shared via Google Drive: you don't have to download the files or keep track of different versions, so cooperation becomes simple and comfortable. So, in my opinion, when we talk about data sharing, the data transaction is only one task to be solved. If you want to deliver information and value, you must also provide the right visualization tool.
Even if you create a webGIS, the problem of downloadable files remains unsolved. You can run a web server alongside an FTP server: the FTP server handles the data transfer, while the web server provides the display. Now that seems too complicated and expensive, right?
Is there a secure data transfer solution that also provides geospatial data display, without you having to develop it yourself? This question came up when we started to develop the first version of SurveyTransfer's business model. Since this problem concerns almost the whole industry, we decided not to take it lightly and to create a solution that makes a lot of people's lives easier. We are still working on a platform where you can transmit data in a safe, simple, and user-friendly way. Your customers will be able to use SurveyTransfer's online viewer without installing anything. They can also make measurements and manage maps and 3D files. It's also a collaboration platform, where files from different projects can be examined directly, without download time, thanks to the built-in map view and 3D model space.
In summary: SurveyTransfer is a modern cloud-based data sharing and data visualization tool that lets you interact with map and 3D data (measurements, labels). Uploaded files can be displayed at the same time, so you can create a layer order in both map and 3D views. Think of SurveyTransfer as Google Drive for spatial data.
Did you like what you read? Do you want to read similar ones?
If you really liked what you just read, you can share it with your friends via our social media sites. 🙂