If you want to upload your data to Mapbox, you’ve come to the right place! From GeoTIFFs to Shapefiles, whether you want to edit your data or style it on a map, this guide outlines the different types of data you can upload, techniques for uploading, and common pitfalls and how to troubleshoot them.
Datasets vs. tilesets
Datasets provide access to feature geometries (points, lines, and polygons) and properties (attributes), both of which can be edited in the Mapbox Studio dataset editor or through the Mapbox Datasets API.
Tilesets are lightweight collections of vector data that are optimized for rendering and are not editable but can be styled in the Mapbox Studio style editor.
Techniques for uploading datasets and tilesets are listed below. The size of your data file will affect how much can be transferred at one time. See the section on transfer limits to know which method is best.
To add your data to a dataset, you can create a new, blank dataset through Mapbox Studio or through the Mapbox Datasets API and then add data to it. Note that the Mapbox Datasets API does not accept GeoJSON files as uploads, but rather as the body of a POST request. See the Mapbox Datasets API documentation for more information.
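As a sketch of that flow, here is how a single GeoJSON feature could be sent as the request body using only the Python standard library. The username, dataset ID, token, and feature are placeholders, and the endpoint shape is an assumption; check the Mapbox Datasets API documentation for the current URL and method.

```python
import json
import urllib.request

# Placeholder values -- substitute your own account, dataset ID, and token.
USERNAME, DATASET_ID, TOKEN = "your-username", "your-dataset-id", "your-access-token"

feature = {
    "type": "Feature",
    "properties": {"name": "Null Island"},
    "geometry": {"type": "Point", "coordinates": [0, 0]},
}

# The feature itself is the request body (not a file upload).
# Assumed endpoint shape; verify against the Datasets API docs.
url = (
    f"https://api.mapbox.com/datasets/v1/{USERNAME}/{DATASET_ID}"
    f"/features/feature-1?access_token={TOKEN}"
)
req = urllib.request.Request(
    url,
    data=json.dumps(feature).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",  # per-feature inserts use PUT; creating the dataset itself is the POST
)
# urllib.request.urlopen(req) would actually send the request.
```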
You can upload your data as a tileset through Mapbox Studio or the Mapbox Uploads API.
Accepted file types and transfer limits
The accepted file types and transfer limits for dataset and tileset uploads include:
| File type | Datasets | Tilesets | Transfer limits |
| --- | --- | --- | --- |
| CSV | ✓ | ✓ | 5 MB for datasets, 1 GB for tilesets |
| GeoJSON | ✓ | ✓ | 5 MB for datasets, 1 GB for tilesets |
| KML | | ✓ | 260 MB with 15 layers or fewer |
| Shapefile | | ✓ | 260 MB (combined uncompressed size of the `.shp` and `.dbf` files) |
If your file size exceeds these limits, see the Troubleshooting section below.
There are a couple of requirements for TIFFs:
- Only 8-bit GeoTIFFs are supported. Run `gdalinfo` to check your GeoTIFF's bit depth (the band type should be `Byte`).
- Mapbox only accepts TIFFs with georeferencing information (GeoTIFFs). Make sure your TIFF is georeferenced before trying to upload.
If you are attempting to upload large TIFFs (multiple GB), here are some ways you can optimize your TIFF before uploading:
- Reproject to Web Mercator.
- Set the blocksize to 256x256.
- If compression is needed, use LZW.
- Remove the alpha band, if applicable.
Upload failures in Mapbox Studio typically occur for one of two reasons:
- There’s an explicit issue with your data.
- The data has failed to process within one hour (two hours for MBTiles files).
If there is an explicit issue with your data, you will receive a descriptive error message when the upload fails. Each message includes an error code that is described in full below. If your upload times out, read through the troubleshooting recommendations below.
Tileset upload errors
| Problem | Solution |
| --- | --- |
| You tried to upload a zipfile that did not include one of the files that make up a shapefile (`.shp`, `.shx`, `.dbf`). | Make sure your zipfile contains each of these files. |
| Your data source may not have any layers. | Check your data source in QGIS or use `ogr2ogr` to make sure it is correct. |
| The TileJSON file in your MBTiles upload contains too much information. | Remove extra or unneeded content from the TileJSON file (inside the MBTiles file). |
| Your KML file is empty. | Make sure you have some layers in your data that are readable. Here's an example of a valid KML file. |
| Your KML file has more than 15 layers. This can happen when combining a number of files into one. | Split your layers into different files before uploading to Mapbox Studio. If you must use KML and require all the layers, consider converting the file with a tool like `ogr2ogr` first. |
| Your MBTiles file has more items than the limit allows. | Try limiting your zoom levels. |
| The uploaded dataset was deleted during processing into a tileset. | Try uploading the dataset again, and do not delete the dataset until the tileset processing has successfully completed. |
| The size of a specific tile in the MBTiles file is too large. | Reduce the detail of data at this zoom level or omit it by adjusting your minzoom. |
| The size of the grid tile in the MBTiles file is too large. | Reduce the detail of data at this zoom level or omit it by adjusting your minzoom. |
| Your GeoJSON file has invalid syntax. | Make sure your GeoJSON is compliant with the GeoJSON specification. You can validate your GeoJSON with GeoJSON Lint. |
| Mapnik is unable to process the GeoJSON file, likely due to invalid syntax. | Make sure your GeoJSON is compliant with the GeoJSON specification. You can validate your GeoJSON with GeoJSON Lint. |
| The coordinates in your file are beyond the extent of Web Mercator. | Check that your coordinates are in the correct order (`[longitude, latitude]`). Try visualizing your GeoJSON in geojson.io to see if geometries appear where you expect. If they do, try reprojecting to Web Mercator prior to uploading. |
| You have one of the following: bad TIFF files that are missing necessary information; invalid MBTiles where the file is not recognized as an SQLite database; invalid MBTiles table data; tmz2 files that have been double zipped. | Try running `gdalinfo` on your TIFF to check for missing information. Make sure your MBTiles file conforms to the MBTiles specification. Check that the file has only been zipped once. |
| While generating tiles from your upload, at least one tile was larger than 5 MB, which is too large. | Try simplifying your data where it is most dense. This typically happens with CSV point datasets where there are many millions of points in a single tile. Try using Tippecanoe to simplify and cluster points before uploading. |
| One of the tiles in your MBTiles file is invalid according to the Mapbox Vector Tile Specification. | All vector tiles must conform to the specification; this check ensures encoders are doing their job correctly. If you encounter this error, please reach out to email@example.com with the error message and we'll help you move forward. |
Dataset upload errors
Most dataset upload errors are related to syntax. Be sure to check your data for syntax errors before uploading. If you are working with GeoJSON data, consider using a tool like GeoJSON Lint to lint your data before uploading. If your error is specifically related to a CSV upload, you can view our CSV file errors troubleshooting guide or investigate further by checking out the library we use to convert CSV files to GeoJSON.
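As a first pass, raw JSON syntax errors can be caught with the standard library alone. This quick sketch only checks that the text parses as JSON, not that it is valid GeoJSON, so a dedicated validator like GeoJSON Lint is still worthwhile:

```python
import json

def check_json_syntax(text):
    """Return None if the text parses as JSON, else the parser's error message."""
    try:
        json.loads(text)
        return None
    except json.JSONDecodeError as err:
        return str(err)

print(check_json_syntax('{"type": "FeatureCollection", "features": []}'))  # None
print(check_json_syntax('{"type": "FeatureCollection",}'))  # a non-None error message
```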
| Problem | Solution |
| --- | --- |
| Your dataset contains a `crs` attribute. | Remove the `crs` attribute from your data before uploading. |
| Your dataset contains one or more GeometryCollections and/or a geometry that is set to `null`. | GeometryCollections and `null` geometries are not supported and must be removed from your dataset. |
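Both fixes can be scripted. Here is a minimal sketch (the helper name is our own) that strips the top-level `crs` member and drops features with `null` or GeometryCollection geometries before a dataset upload:

```python
def clean_for_dataset(feature_collection):
    """Remove pieces that dataset uploads reject: the top-level crs
    attribute, null geometries, and GeometryCollections."""
    cleaned = dict(feature_collection)
    cleaned.pop("crs", None)
    cleaned["features"] = [
        f for f in feature_collection["features"]
        if f.get("geometry") is not None
        and f["geometry"]["type"] != "GeometryCollection"
    ]
    return cleaned

fc = {
    "type": "FeatureCollection",
    "crs": {"type": "name", "properties": {"name": "urn:ogc:def:crs:OGC:1.3:CRS84"}},
    "features": [
        {"type": "Feature", "properties": {}, "geometry": {"type": "Point", "coordinates": [0, 0]}},
        {"type": "Feature", "properties": {}, "geometry": None},
    ],
}
cleaned = clean_for_dataset(fc)
print("crs" in cleaned, len(cleaned["features"]))  # False 1
```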
Timed out uploads

If you receive a `Processing timed out.` message after a lengthy "processing" status, it is likely because your file has taken more than one hour (two hours for MBTiles files) to process and has timed out. To keep our upload queue fresh, we limit the processing time for particularly large uploads. The following techniques can be used to prepare your data and improve processing time.
Note: the troubleshooting advice here mostly relates to tilesets, although some may be applicable to datasets as well. If you are having trouble uploading datasets and your issue is not listed here, please contact support.
Reproject to Web Mercator
During upload processing, we reproject all geometries to Web Mercator (EPSG:3857) before encoding into vector tiles. During the vector tile encoding process, if your data isn’t Web Mercator, each vertex must be reprojected during encoding, which can take a long time.
We suggest reprojecting your data before uploading to skip this process and speed up your upload. Here’s how you can reproject your data with open source tools:
- Use the `ogr2ogr` command line utility. For example, to convert a Shapefile to Web Mercator:

```
ogr2ogr output.shp -t_srs "EPSG:3857" input.shp
```
- QGIS allows for reprojection: right-click your layer -> Save As -> select "Web Mercator EPSG:3857" as the output projection.
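The projection itself is a simple formula. As an illustration of what these tools do for every vertex, here is the spherical Mercator (EPSG:3857) forward conversion in pure Python; for real data, use `ogr2ogr` or QGIS as described above:

```python
import math

R = 6378137.0  # Earth radius used by spherical (Web) Mercator, in meters

def to_web_mercator(lon, lat):
    """Convert WGS84 longitude/latitude in degrees to EPSG:3857 meters."""
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y

x, y = to_web_mercator(180.0, 0.0)
print(round(x, 2))  # 20037508.34 -- the edge of the Web Mercator extent
# y is ~0 at the equator
```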
Multipart to singlepart
Multipart geometries can be complex: a single feature can consist of hundreds of thousands of polygons. These complex multipart geometries increase processing time and can lead to timeouts.
To improve processing speeds, you can break each polygon into its own unique feature (singlepart) using QGIS. This will reduce the complexity per feature and allow the data to process faster. Note that each individual singlepart feature will share the attributes of the original feature.
For example, if the original feature has an attribute like `population: 100`, that attribute is duplicated onto every resulting singlepart feature. If you plan on styling based on attributes such as this, be wary of splitting into singleparts!
There are a couple of helpful tools for doing this:
- In QGIS you can use either the Vector -> Geometry Tools -> Multipart to singleparts tool or the Multipart Split plugin.
- If you are using GeoJSON and Node.js, you can use the geojson-flatten module.
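If you prefer to stay in plain Python, the flattening itself is straightforward. This sketch (our own helper, not part of geojson-flatten) splits a Multi* feature into singlepart features and, as noted above, duplicates the original properties onto each piece:

```python
def flatten_feature(feature):
    """Split a Multi* GeoJSON feature into singlepart features,
    copying the original properties onto each piece."""
    geom = feature["geometry"]
    multi_types = {
        "MultiPoint": "Point",
        "MultiLineString": "LineString",
        "MultiPolygon": "Polygon",
    }
    if geom["type"] not in multi_types:
        return [feature]  # already singlepart
    single_type = multi_types[geom["type"]]
    return [
        {
            "type": "Feature",
            "properties": dict(feature["properties"]),  # attributes are duplicated
            "geometry": {"type": single_type, "coordinates": coords},
        }
        for coords in geom["coordinates"]
    ]

multi = {
    "type": "Feature",
    "properties": {"population": 100},
    "geometry": {
        "type": "MultiPolygon",
        "coordinates": [
            [[[0, 0], [1, 0], [1, 1], [0, 0]]],
            [[[2, 2], [3, 2], [3, 3], [2, 2]]],
        ],
    },
}
parts = flatten_feature(multi)
print(len(parts))                  # 2
print(parts[0]["properties"])      # {'population': 100}
```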
Simplify your data

Simplifying your data means removing complexity in the vertices of your geometry. Each vertex must be translated to vector tile coordinates. The fewer vertices to translate, the faster processing becomes. Often you can simplify your data without any visual change. It's important to watch out for oversimplification, though! Oversimplifying could remove important granularity in your data as well as potentially create invalid geometries if lines begin overlapping.
Simplification tools typically take a tolerance parameter to specify how much to simplify. Some tools to use for simplifying data:
- QGIS vector simplification: Vector -> Geometry Tools -> Simplify geometries
- Turf.js simplify
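Both tools implement variants of the same idea. As an illustration, here is a minimal Douglas-Peucker simplification in pure Python (a common simplification algorithm, though the exact algorithm each tool uses may differ), where `tolerance` controls how aggressively vertices are dropped:

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    if a == b:
        return math.dist(p, a)
    (x, y), (x1, y1), (x2, y2) = p, a, b
    num = abs((x2 - x1) * (y1 - y) - (x1 - x) * (y2 - y1))
    return num / math.dist(a, b)

def simplify(points, tolerance):
    """Douglas-Peucker: drop vertices closer than `tolerance`
    to the line between the segment endpoints."""
    if len(points) < 3:
        return points
    # find the vertex farthest from the line joining the endpoints
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]  # everything in between is negligible
    left = simplify(points[: index + 1], tolerance)
    right = simplify(points[index:], tolerance)
    return left[:-1] + right  # don't duplicate the shared vertex

nearly_straight = [(0, 0), (1, 0.0001), (2, 0)]
print(simplify(nearly_straight, 0.01))  # [(0, 0), (2, 0)] -- middle vertex dropped
zigzag = [(0, 0), (1, 1), (2, 0)]
print(simplify(zigzag, 0.01))           # unchanged -- the detour is significant
```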
Limit large features
Large features that span the entire dataset can slow down processing.
For example, consider this dataset of Hawaii. It contains a handful of smaller polygons that represent the islands. It also contains a large polygon that represents the surrounding water. Since the bounding box of the water polygon will intersect with nearly all the tile boundaries (grey lines), the water polygon will need to be processed for nearly every tile within this tileset.
There is no exact solution for this, since it largely depends on the dataset and how you plan to style and use the data.
Some possible solutions include:
- Remove the large polygon if it's not necessary for your use case.
- Split the large polygon into smaller polygons: after creating a digitized layer of smaller polygons, use that digitized layer to intersect with the large polygon and split it into pieces. Then add the newly split features back into your original dataset. In QGIS: Vector -> Geoprocessing Tools -> Intersection. Caution: this could create unwanted polygon borders, depending on how you plan to style the dataset.
Slice large contour datasets
Large contour datasets can be particularly complex. Often they will have long, single feature linestrings wrapping across the entire dataset. Like the large polygons above, these can take a long time to process.
We recommend using GRASS's `v.split` function via QGIS to break lines into shorter, equal segments. Smaller geometries will improve processing speed. If the contour data is highly detailed (as in, requires zoom 22), we recommend breaking lines every 5 kilometers.
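The same idea can be sketched in a few lines of Python for a GeoJSON coordinate list. This hypothetical helper splits by vertex count rather than by distance (unlike `v.split`), keeping shared boundary vertices so the pieces still join up:

```python
def split_line(coords, max_vertices=100):
    """Split a long linestring's coordinate list into shorter pieces
    that share their boundary vertices, so the pieces remain connected."""
    pieces = []
    for i in range(0, len(coords) - 1, max_vertices - 1):
        pieces.append(coords[i : i + max_vertices])
    return pieces

line = [(i, 0) for i in range(10)]  # a 10-vertex linestring
pieces = split_line(line, max_vertices=4)
print(len(pieces))                      # 3
print(pieces[0][-1] == pieces[1][0])    # True -- adjacent pieces share a vertex
```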
Pre-generate tilestats for MBTiles
We generate summary documents, known as tilestats, for uploads so Mapbox Studio can see what types of data and properties are in your spatial data. This takes quite a long time for large MBTiles files and can lead to timeouts. If you are using Tippecanoe to generate your MBTiles file you can bypass this step by using version 1.21.0 or later of Tippecanoe, which pre-generates a tilestats object. This can cut upload times in half.
If you aren't using Tippecanoe, you can still use the `tile-join` operation provided by Tippecanoe to generate the tilestats document. Make sure to use version 1.22.0 or later.

```
tile-join -o with-tilestats.mbtiles original.mbtiles
```