Importer Issues

While the importer usually works well, CSV parsing is notoriously finicky and there are dozens of reasons why you might be getting odd results that usually have to do with the format of the CSV or the data itself. Here are some steps you can take to make sure your import goes as smoothly as possible.

📘 Tip:

It's often helpful to open up your CSV in Atom, Notepad or some other plaintext editor. Weird artifacts like badly encoded characters, extra blank columns, and empty rows at the end of your file are often more obvious if you examine the file outside of Excel. Excel tries to show you what it thinks you want to see, not necessarily what's actually in the file.
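
If you'd rather check from a terminal, a rough equivalent (assuming a Mac/Linux shell; the path is just a placeholder) is to dump the first few lines and make non-printing characters visible:

# show the first few lines exactly as they're stored in the file
head -n 5 /path/to/filename.csv
# expose tabs, carriage returns, and other non-printing characters (GNU cat; on macOS try cat -vet instead)
cat -A /path/to/filename.csv | head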

Uploading/Processing hangs at 10%

This is almost always caused by one of three different things, which you can usually determine by examining your app log file:

  • Permissions on your storage/private_uploads directory are incorrect and the web server cannot write to the directory to upload your file (see the permissions check after this list)
  • Something is wrong with the format of the CSV (see the rest of this page for tips)
  • You have an incorrect URL in your .env which is causing the JavaScript to break. Check your browser's JavaScript console for clues here.
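
For the first item, a quick check from the command line might look like the sketch below. The www-data user and the relative paths are assumptions: your web server may run as apache, nginx, or something else, and you should run this from your Snipe-IT install directory (possibly with sudo).

# confirm who owns the upload directory and whether the web server can write to it
ls -ld storage/private_uploads
# if the web server user can't write to it, handing it ownership usually clears the hang
chown -R www-data:www-data storage/private_uploads
chmod -R 775 storage/private_uploads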

Make sure your CSV is comma-delimited

If you're on Windows and are having trouble getting your export to use comma delimiters, please see the importer documentation that addresses this.
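
A quick way to confirm the delimiter (Mac/Linux; the path is a placeholder) is to look at the header row in a terminal:

head -n 1 /path/to/filename.csv

If the fields are separated by semicolons or tabs instead of commas, re-export the file with comma delimiters before importing.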

Check Required Fields

The importer documentation outlines which fields are required for each particular kind of import (asset, people, accessories, etc). If any of the required fields are missing, the import will skip those items and throw a validation error.

Even if, for example, the asset model already exists in the database, you still have to include category information, etc.

Check for Blank Values

Open your file in Atom or Notepad (or any other plaintext editor) and check for the following common problems:

  • Check for blank columns to the right of your data
  • Check for blank rows at the end of your file
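
If you'd rather check from the command line, a rough sketch (Mac/Linux; the path is a placeholder) is to look at the end of the file and search for rows that are empty or contain nothing but commas:

# stray blank rows usually live at the very end of the file
tail -n 5 /path/to/filename.csv
# list the line numbers of rows that are empty or consist only of commas
grep -nE '^,*$' /path/to/filename.csv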

Check for Duplicate Column Headers

Each column header must be unique. If column headers are repeated, you may get unpredictable results, or the import could crash out.

If you get an error in your logs like:

InvalidArgumentException: Use a flat array with unique string values

... that means that you either have a repeated column header somewhere, or you have extra blank columns to the right of your data.
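
One quick way to spot a repeated header from the command line (a sketch that assumes a comma-delimited file whose header names don't themselves contain commas; the path is a placeholder):

head -n 1 /path/to/filename.csv | tr ',' '\n' | sort | uniq -d

Any header it prints appears more than once; a blank line in the output usually means trailing empty columns.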

Check for Weird Characters, BOMs or Linefeeds

We see this most commonly with files that originated from an older software application like FileMaker, where the characters were not encoded in UTF-8, where BOMs (byte order marks) are added, or where unusual linefeeds or returns are used.

You'll recognize this situation by looking for garbled characters like â€™ in your CSV or spreadsheet. For example, if a user's name is O'Something and it displays in Excel or Notepad as Oâ€™Something, there is an encoding issue.

Sometimes you can fix that, sometimes you can't. For more information than any reasonable human would ever want to know about encoding, check out this article.
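
To check whether a file has a BOM or a non-UTF-8 encoding, one rough approach (assuming a Mac/Linux system with the standard file and iconv utilities; the paths and the Windows-1252 source charset are assumptions) is:

# report the detected encoding; look for "with BOM" or a charset other than UTF-8/ASCII
file /path/to/filename.csv
# convert from Windows-1252 (a common culprit) to UTF-8; adjust the source charset to whatever file reports
iconv -f WINDOWS-1252 -t UTF-8 /path/to/filename.csv > /path/to/filename-utf8.csv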

To convert Windows (CRLF) or classic Mac (CR) line endings to Unix linefeeds, if you have perl installed, you can run the following:

perl -pe 's/\r\n?/\n/g' < /path/to/filename.csv > /path/to/filename-linefeeds.csv

Check Your Date Format

While the PHP parser tries its best to convert from a variety of date formats, some are ambiguous (01/02/2023 could be read as January 2 or February 1), so it's usually best to stick with yyyy-mm-dd for all dates.

Excel Import and Re-Export

Excel is amazingly good at handling spreadsheets and CSVs. It does a lot of magical things behind the scenes to try to account for encoding anomalies, archaic linefeeds, and so on.

Sometimes an easy win, especially if the data originally came from something other than Excel, is to import the file and let Excel do its job:

  • import the file into Excel
  • save it as an XLS
  • close the file
  • open the new XLS
  • export as a new UTF-8 CSV
  • try importing the new CSV

Try Breaking it into Smaller Chunks

Depending on the memory available in your Snipe-IT install, you may run into memory issues when uploading CSVs with thousands and thousands of records. This is because for each row in your CSV, the importer has to do a lot of work: looking up things like location, user, asset model, etc - and then creating them if they don't exist.

If you run into memory issues, or if your import gets stuck at processing without giving you any useful information in your logs, try breaking your CSV into smaller chunks and see if that helps.

Mac/Linux users can do this via the command line very quickly by running:

split -l 500 filename.csv

This will split your file into separate files with 500 lines each. You'll need to rename them with a .csv file ending and add the header rows to the newly created files, but it can be a lot faster than manually breaking them out via Excel.
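
If you want the header row carried into every chunk automatically, a small sketch along these lines (bash on Mac/Linux; the filenames are placeholders) can save the manual step:

# keep the header aside, split the remaining rows into 500-line chunks, then rebuild each chunk as a CSV with the header on top
head -n 1 filename.csv > header.csv
tail -n +2 filename.csv | split -l 500 - chunk_
for f in chunk_*; do cat header.csv "$f" > "$f.csv"; rm "$f"; done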

Custom Fields Not Importing

If your custom field column headers exactly match the names of the fields in your Custom Fields section, your custom fields should import with no issues. (Remember that the custom fields MUST already exist in Snipe-IT for their values to be imported.)

If, after importing, you don't see your imported custom fields on your assets, double-check that the asset models those assets are associated with have the right (or any) fieldset attached. The importer will import the custom field values, but if the fieldset isn't associated with the model, they won't display on the screen.

Edit (or bulk edit) your asset models and add an associated fieldset to get the custom fields for your new assets and asset models to display in Snipe-IT.

Imported Assets Not Checked Out

Make sure the column header for the user the asset is supposed to be checked out to is "Full Name", per the importer documentation.
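
For example, a checkout-on-import CSV might look roughly like the snippet below. Only the Full Name header is the one confirmed above; the other column names are illustrative, so check the importer documentation for the exact headers your import type requires.

Item Name,Asset Tag,Full Name
MacBook Pro 14,A-0001,Jane Smith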