
How to Import Data into SharePoint: Methods, Limits & Tools

A technical guide to importing data into SharePoint: native CSV limits, Power Automate's 256-row default, API throttling, and how to choose the right method for your dataset.

Raaj Raaj · · 18 min read

Importing data into SharePoint sounds straightforward — upload a CSV, map some columns, done. In practice, you hit a wall of hidden constraints: the list view threshold is configured at 5,000 items, the REST API rejects single uploads over 250 MB, Power Automate silently stops at 256 rows, and non-UTF-8 CSVs produce blank files on upload. SharePoint stores lists as typed columns, manages document libraries with versioned metadata and granular permissions, and represents modern pages as structured sitePage objects — not as uploaded files. When you try to force flat data into this environment without preparation, metadata is lost, API requests are throttled, and lists become unusable.

This guide covers every import method available, the exact limits you'll encounter, the data-mapping behaviors that silently break your structure, and how to choose the right approach based on dataset size and complexity.

For a companion piece on getting data out, see How to Export Data from SharePoint: Methods, Limits & Migration Prep.

Every Way to Import Data into SharePoint

SharePoint accepts data through native UI uploads, automation workflows, direct API calls, dedicated migration tools, and managed migration services. Each method handles different data structures — lists, document libraries, page content, metadata — with different fidelity.

| Method | List Data | Documents | Metadata | Attachments | Best For |
|---|---|---|---|---|---|
| CSV/Excel Import (UI) | ✅ (≤5K rows per view) | ❌ | Partial (column mapping) | ❌ | Small lists, quick imports |
| Edit in Grid View | ✅ (copy-paste) | ❌ | Limited | ❌ | Ad-hoc data entry |
| Power Automate | ✅ (row-by-row) | ✅ | ✅ | ✅ | Recurring syncs, medium datasets |
| Microsoft Graph / REST API | ✅ | ✅ | ✅ | ✅ | Custom bulk migrations |
| SPMT | ✅ | ✅ | ✅ | ✅ | On-prem → Online moves |
| ShareGate | ✅ | ✅ | ✅ | ✅ | Tenant-to-tenant, reorganization |
| AvePoint Fly | ✅ | ✅ | ✅ | ✅ | Enterprise, multi-source |
| ClonePartner | ✅ | ✅ | ✅ | ✅ | Complex, high-volume, zero downtime |

Native Import Methods: CSV, Excel, and File Uploads

Importing a Spreadsheet into a SharePoint List

The most common import path is uploading a CSV or Excel file directly into a SharePoint list. Here's what actually happens:

  1. Navigate to the target SharePoint site → Site Contents → New → List → From Excel.
  2. Select your .xlsx or .csv file.
  3. SharePoint creates a new list with columns mapped from your spreadsheet headers.
  4. Column types are inferred — and frequently wrong. A column containing "12345" may become Number when you need it as Text. Dates in non-US formats (DD/MM/YYYY) often get misinterpreted. Currency values may land as plain numbers.
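Before handing a file to the importer, a quick pre-flight scan catches the two most common misdetections: numeric-looking IDs with leading zeros and day-first dates. A minimal sketch; the sample column names (PartID, Shipped) are illustrative:

```python
import csv
import io

def is_day_first(value):
    """True for dates like 31/01/2026, where the first part cannot be a month."""
    parts = value.split("/")
    return len(parts) == 3 and parts[0].isdigit() and int(parts[0]) > 12

def flag_risky_columns(csv_text):
    """Flag columns SharePoint's type inference is likely to mangle."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    risky = {}
    for col in rows[0]:
        values = [r[col] for r in rows if r[col]]
        if any(v.isdigit() and v.startswith("0") for v in values):
            risky[col] = "leading zeros: force Text"
        elif any(is_day_first(v) for v in values):
            risky[col] = "day-first date: normalize to ISO 8601"
    return risky

sample = "PartID,Shipped\n01234,31/01/2026\n00987,15/02/2026\n"
```

Run the scan, fix the flagged columns at the source, and only then import.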

Microsoft documents this path for tables up to 5,000 rows. The uploaded workbook is saved in the site's Site Assets library. It is not available in GCC High or DoD. (support.microsoft.com)

One useful nuance: this import path can create a list without the default Title column, which is handy when your source already has its own primary field.

Warning

The 5,000-item view threshold is not a storage limit. You can store up to 30 million items or files in a list or library; the threshold only caps how many items a single view may return. Exceed it and SharePoint shows an error message instead of the view. Your data still exists, but the UI, PowerApps connections, and certain operations break once a single view tries to render more than 5,000 rows.

Creating a List from CSV

CSV import works similarly to Excel import, but with a more explicit column-type selection step during the wizard. The uploaded .csv is also stored in Site Assets. This creates a flat SharePoint list — it does not preserve hierarchy, relations, attachments, comments, permissions, or page layouts. Treat it as a convenience importer, not a migration engine. (support.microsoft.com)

Edit in Grid View (Copy-Paste)

For quick, small imports: open your list, switch to Edit in Grid View, and paste rows directly from Excel. This works for under ~100 rows. Beyond that, the browser starts choking. Column types must already exist and match — there's no auto-creation of columns during paste.

Microsoft is explicit that you cannot attach a file to list items when working in grid view. That matters because many operators discover too late that their CSV import covered the rows but not the files attached to those rows. (support.microsoft.com)

File Upload to Document Libraries

The maximum file upload size for SharePoint Online is 250 GB via browser or OneDrive sync. But programmatic uploads behave differently — if you attempt to go over the API limit and catch the WebException, you will see: "The request message is too big. The server does not allow messages larger than 262144000 bytes." That's the 250 MB single-request API ceiling. For larger files, you must use file chunking via the StartUpload / ContinueUpload / FinishUpload endpoints.
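The decision logic reduces to a small planning function. This is a sketch, not an implementation: it assumes 10 MiB slices (any size below the ceiling works) and only plans the StartUpload / ContinueUpload / FinishUpload sequence; the actual CSOM calls are omitted.

```python
SINGLE_REQUEST_LIMIT = 262_144_000   # 250 MB ceiling from the CSOM error message
CHUNK_SIZE = 10 * 1024 * 1024        # 10 MiB slices; the size is an assumption

def chunk_plan(file_size):
    """Return (endpoint, offset, length) triples for a chunked upload,
    or a single-request plan when the file fits under the API ceiling."""
    if file_size <= SINGLE_REQUEST_LIMIT:
        return [("SingleRequest", 0, file_size)]
    plan, offset = [], 0
    while offset < file_size:
        length = min(CHUNK_SIZE, file_size - offset)
        is_last = offset + length >= file_size
        name = ("StartUpload" if offset == 0
                else "FinishUpload" if is_last
                else "ContinueUpload")
        plan.append((name, offset, length))
        offset += length
    return plan
```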

Files attached to a list item are limited to 250 MB, which applies to both Microsoft Lists and SharePoint lists. The total decoded path limit is 400 characters.

Uploading HTML, Markdown, or exported Google Docs files stores them as files. It does not recreate modern SharePoint pages, sections, or web parts. Modern pages are handled as sitePage objects with canvasLayout — a separate construct entirely. (learn.microsoft.com)

Importing Managed Metadata Term Sets

SharePoint supports importing term sets via the Term Store Management Tool using a UTF-8 CSV file. The hierarchy can go up to seven levels deep. There is a known issue in SharePoint Online where the term store tool can fail when importing more than 5,000 terms — Microsoft recommends CSOM for larger term sets. (support.microsoft.com)

This matters for import planning: if your source data uses taxonomy-driven columns (managed metadata), you must seed the term store before importing list items that reference those terms.
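Seeding can be scripted by generating the import CSV up front. The header below follows the sample import file downloadable from the Term Store Management Tool; verify it against your tenant's sample before relying on it, and note the pipe-delimited term paths are an illustrative input convention.

```python
import csv
import io

# Column layout of the term-store sample import file (verify in your tenant).
HEADER = ["Term Set Name", "Term Set Description", "LCID", "Available for Tagging",
          "Term Description", "Level 1 Term", "Level 2 Term", "Level 3 Term",
          "Level 4 Term", "Level 5 Term", "Level 6 Term", "Level 7 Term"]

def term_set_csv(set_name, term_paths):
    """Build a term-store import CSV from paths like 'EMEA|Germany'.
    The hierarchy is capped at seven levels."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(HEADER)
    writer.writerow([set_name, "", "1033", "True", "", *([""] * 7)])
    for path in term_paths:
        levels = path.split("|")
        if len(levels) > 7:
            raise ValueError(f"more than seven levels: {path}")
        writer.writerow(["", "", "", "True", ""] + levels + [""] * (7 - len(levels)))
    return out.getvalue()
```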

Power Automate and API Constraints

When native methods fail, teams turn to automation or custom scripts. You need to understand Microsoft's throttling architecture before building these solutions.

Power Automate's 256-Row Default

By default, the 'List rows present in a table' action in Microsoft Power Automate retrieves only the first 256 rows of an Excel file. This is the most common gotcha in Excel-to-SharePoint workflows: the flow finishes without error and silently ignores everything past row 256 (a suspiciously round 2^8, which is the tell that it's a hard-coded default, not a data problem).

The fix: open the action's Settings, enable Pagination, and set the threshold; valid values run from 0 to 100,000. There is a catch, though: scheduled flows enforce a 5,000-row pagination limit for 'List rows present in a table'. Instant flows can handle higher counts, but scheduled flows fail validation above 5,000.

The Excel Online connector also has its own limits: source files must be under 25 MB, the connector supports up to 500 columns, and concurrent writes to the same workbook are unsupported. Retry behavior in long-running write scenarios can produce duplicate inserts. (learn.microsoft.com)

For a real-world benchmark: one flow ran for 5 hours 20 minutes, and a second flow ran for 6 hours 30 minutes. Both flows ultimately reported failure, yet all 100,000 records had been written correctly. Power Automate works at scale, but it is slow, and its run status is an unreliable success signal.

SharePoint Connector Throttling

600 API calls per connection per 60 seconds: this is the primary constraint. If your flow makes more than 600 calls to SharePoint actions within a minute, you'll receive HTTP 429 errors. The count is per connection, not per flow, so multiple flows using the same connection share this quota.
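A client-side limiter that tracks calls in a rolling window keeps a script under the quota instead of discovering it through 429s. A minimal sketch (the quota values mirror the connector limit above):

```python
import collections

class ConnectionRateLimiter:
    """Sliding-window limiter for the SharePoint connector quota:
    600 API calls per connection per rolling 60 seconds."""

    def __init__(self, max_calls=600, window_seconds=60):
        self.max_calls = max_calls
        self.window = window_seconds
        self.calls = collections.deque()  # timestamps of recent calls

    def wait_time(self, now):
        """Seconds to wait at time `now` before the next call is allowed."""
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()  # drop calls that aged out of the window
        if len(self.calls) < self.max_calls:
            return 0.0
        return self.window - (now - self.calls[0])

    def record(self, now):
        """Register a call that was just made."""
        self.calls.append(now)
```

Because the quota is per connection, all flows or workers sharing a connection should share one limiter instance.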

Microsoft Graph API Resource Units

SharePoint exposes several APIs, and each call costs a different amount depending on its complexity. SharePoint normalizes these costs into resource units (RUs).

The practical rule: assume an average cost of 2 resource units per request, so dividing an RU limit by 2 gives the estimated request budget. A tenant with 0–1,000 licenses gets approximately 5,400,000 daily resource units. Operations like create, update, or delete cost roughly 2 RUs each.
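That arithmetic is worth encoding once rather than re-deriving per project. A sketch of the estimate, using the smallest tenant tier's quota:

```python
def estimated_daily_requests(daily_resource_units, avg_ru_per_request=2):
    """Divide the daily RU quota by the ~2 RU average cost per request."""
    return daily_resource_units // avg_ru_per_request

daily_budget = estimated_daily_requests(5_400_000)  # 0-1,000 license tier
sustained_per_second = daily_budget / 86_400        # spread evenly over 24h
```

About 2.7 million requests per day, or roughly 31 per second sustained, which is why large imports are planned in days, not hours.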

In addition to throttling by user account, limits are also applied to applications in a tenant. Every application has its own limits in a tenant, which are based on the number of licenses purchased per organization.

When your application gets throttled, SharePoint Online returns an HTTP 429 ("Too Many Requests") or 503 ("Server Too Busy") status and the request fails. In both cases, a Retry-After header is included in the response. If your script ignores that header, your failure rate compounds.
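A robust import loop treats Retry-After as authoritative and only falls back to its own backoff when the header is absent. A sketch; the base and cap values are assumptions, not Microsoft guidance:

```python
import random

def backoff_seconds(attempt, retry_after_header=None, base=2.0, cap=300.0):
    """Honor the server's Retry-After header when present; otherwise use
    capped exponential backoff with a little jitter."""
    if retry_after_header is not None:
        return float(retry_after_header)
    delay = min(cap, base * (2 ** attempt))
    return delay + random.uniform(0, delay * 0.1)
```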

Tip

Microsoft Graph is a cloud-born API that carries the latest improvements and optimizations; in general, it consumes fewer resources than CSOM and REST to achieve the same functionality. If you're writing custom import scripts, prefer Graph over legacy CSOM/REST.

Large File Uploads via API

For large binaries, use upload sessions rather than single-request uploads:

POST /me/drive/items/{itemID}:/{item-path}:/createUploadSession

Upload sessions require fragments to be uploaded sequentially, each request must be under 60 MiB, and chunk sizes must be a multiple of 320 KiB. Out-of-order uploads fail. Oversized fragments fail. Non-320-KiB multiples can fail at commit time. (learn.microsoft.com)
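The 320 KiB rule is easiest to satisfy by construction: pick a fragment size that is a multiple of the unit and emit sequential Content-Range headers. A sketch, using 16 × 320 KiB = 5 MiB fragments (the multiplier is a common choice, not a Graph requirement):

```python
CHUNK_UNIT = 320 * 1024          # fragment sizes must be multiples of 320 KiB
MAX_FRAGMENT = 60 * 1024 * 1024  # each request must stay under 60 MiB

def fragment_ranges(file_size, units_per_fragment=16):
    """Yield Content-Range header values for a Graph upload session,
    uploaded strictly in order (out-of-order fragments fail)."""
    size = CHUNK_UNIT * units_per_fragment
    assert size % CHUNK_UNIT == 0 and size < MAX_FRAGMENT
    offset = 0
    while offset < file_size:
        end = min(offset + size, file_size) - 1
        yield f"bytes {offset}-{end}/{file_size}"
        offset = end + 1
```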

Why Data Mapping and Nested Content Break During Import

CSV Columns → SharePoint List Columns

When you import a CSV into SharePoint, each column header becomes a list column. The type mapping is the first point of failure:

  • Text fields work reliably — single-line text is the default.
  • Numbers get auto-detected, but leading zeros are stripped (ZIP codes, part numbers).
  • Dates assume US format (MM/DD/YYYY). European or ISO dates may be flipped or rejected.
  • Choice / Lookup columns don't exist in flat CSV. Any column that should be a lookup to another list becomes plain text.
  • Multi-value fields (e.g., tags separated by ;) import as a single text string, not as a multi-select choice.
  • Rich text / HTML in a CSV cell is stored as plain text — all formatting is lost.
  • Person/Group and Managed Metadata fields need pre-existing targets. Users must exist in the directory; term sets must exist in the term store.
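In practice this means transforming each CSV row before it ever reaches SharePoint. A minimal sketch; the column names (Tags, DueDate) and the semicolon delimiter are illustrative:

```python
from datetime import datetime

def to_fields_payload(row, multi_value_cols=("Tags",), date_cols=("DueDate",)):
    """Reshape one CSV row into a list-item fields payload: split
    semicolon-delimited multi-value cells into lists and normalize
    day-first dates to ISO 8601. Everything else stays as text."""
    fields = {}
    for key, value in row.items():
        if key in multi_value_cols:
            fields[key] = [v.strip() for v in value.split(";") if v.strip()]
        elif key in date_cols:
            fields[key] = datetime.strptime(value, "%d/%m/%Y").date().isoformat()
        else:
            fields[key] = value
    return fields
```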

The Lookup Column Threshold

You can have no more than 12 complex columns (lookup, person/group, managed metadata) in a single view. Each lookup column adds a join against another list, and Microsoft caps the count at 12 to avoid query degradation. If your source data has deeply relational structures, like a CRM export with 15+ linked entities, you'll need to redesign your list schema before import.

For a deeper look at designing the right target architecture, see Mastering SharePoint Information Architecture 2026: Content Types, Metadata & Lists.

What Gets Preserved vs. What Breaks

| Data Element | CSV/Excel Import | Power Automate | Graph API |
|---|---|---|---|
| Plain text values | ✅ | ✅ | ✅ |
| Column types (auto-detect) | Partial (often wrong) | Manual mapping | Full control |
| Lookup relationships | ❌ (flattened to text) | ❌ (requires separate logic) | ✅ (with custom code) |
| Managed metadata | ❌ | ❌ | Partial |
| Attachments | ❌ | ✅ (separate action) | ✅ |
| Version history | ❌ | ❌ | ❌ |
| Permissions | ❌ | ❌ | ❌ (separate API calls) |
| Created/Modified dates | Overwritten to import date | Overwritten | Can be set via API* |

* Setting Created and Modified timestamps via API requires site collection admin rights for the calling account and the ValidateUpdateListItem method; the standard PATCH endpoint ignores these fields.
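The call itself is a POST to .../items({id})/ValidateUpdateListItem with a formValues array. A sketch of building the request body; the date strings shown are illustrative and must match the site's regional settings, which is where back-dating usually goes wrong:

```python
def validate_update_payload(field_values, new_document_update=False):
    """Build the request body for a POST to
    _api/web/lists/GetByTitle('...')/items(<id>)/ValidateUpdateListItem."""
    return {
        "formValues": [
            {"FieldName": name, "FieldValue": value}
            for name, value in field_values.items()
        ],
        "bNewDocumentUpdate": new_document_update,
    }

# Back-dating Created/Modified (US date format shown as an illustration).
body = validate_update_payload({
    "Created": "6/1/2019 10:15 AM",
    "Modified": "6/3/2019 4:05 PM",
})
```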

Modern Pages and Rich Content

SharePoint modern pages use a canvasLayout JSON structure. Importing Markdown or HTML directly into a SharePoint page body requires complex parsing; otherwise, it renders as raw text. If your source is Markdown, HTML, Evernote, or Google Docs, SharePoint has no native page-conversion pipeline that preserves sections, web parts, comments, and layout. If all you need is storage, upload files to a library. If you need editable modern pages, you're rebuilding content, not importing it. (learn.microsoft.com)

Attachments are a separate track. Grid view won't handle them. List item attachments have a smaller limit than document-library files. In practice, high-fidelity imports split into at least two streams: rows and metadata on one side, binaries on the other.

Edge Cases That Break Real Imports

Encoding: The Silent Data Destroyer

A recurring report from real deployments: .csv files uploaded to a plain document library turn up blank after upload, whether added via drag-and-drop or the upload button.

This happens because SharePoint expects UTF-8 encoding, but Excel on Windows defaults to ANSI when saving as CSV, so special characters such as the German ä, ö, and ü are mangled in transit. A common PowerShell workaround reads the file's content, writes it to a new CSV with UTF-8 encoding, and uploads that file instead.

The fix: Always save CSVs as CSV UTF-8 (Comma delimited) from Excel, or use a text editor to convert encoding before import. For PowerShell scripts, specify -Encoding UTF8 explicitly. Verify in Notepad++ or VS Code before uploading.
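The conversion is a two-line script. This sketch assumes the source was saved as Windows cp1252 (the usual ANSI default); confirm the actual source encoding for your locale before batch-converting:

```python
from pathlib import Path

def reencode_csv(src, dest, source_encoding="cp1252"):
    """Re-save a Windows-ANSI CSV as UTF-8 with BOM so SharePoint and
    Excel both read umlauts (ä, ö, ü) correctly."""
    text = Path(src).read_text(encoding=source_encoding)
    Path(dest).write_text(text, encoding="utf-8-sig")
```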

Duplicate and Missing Records

SharePoint lists don't enforce unique constraints by default. Import the same CSV twice and you get double the rows. Power Automate flows that fail mid-run and retry can create partial duplicates — the first 3,000 rows import successfully, the flow times out, retries, and imports those 3,000 again before continuing.

Prevention: Add an indexed column with a unique constraint before import. On the API side, implement idempotency checks — query for existing records before creating new ones. Use an external ID and upsert logic, not blind inserts.
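The upsert split is simple enough to keep in every import script. A sketch, assuming you have already queried SharePoint for the existing external-ID-to-item-ID map (ExternalID is an illustrative column name):

```python
def plan_upserts(incoming_rows, existing_by_external_id, key="ExternalID"):
    """Split rows into creates and updates by a stable external ID, so a
    retried import never blind-inserts duplicates."""
    creates, updates = [], []
    for row in incoming_rows:
        item_id = existing_by_external_id.get(row[key])
        if item_id is None:
            creates.append(row)
        else:
            updates.append((item_id, row))
    return creates, updates
```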

Large Dataset Performance

The threshold impacts everything — from the UI to PowerShell to API performance and list configuration. Once you exceed it, a cascade of UX, automation, and governance problems follow. If the list view exceeds 5,000 items, Export to Excel fails, even if you're only exporting a subset.

Practical impact during import:

  • Lists with 20,000+ items slow down noticeably during bulk inserts.
  • Power Automate flows processing 50,000+ rows can take 6+ hours and may time out.
  • API batch operations hitting throttle limits during peak hours get progressively slower as Retry-After intervals compound.

Throttling and performance slowdowns have a higher tendency to occur during peak hours. Off-peak hours are typically nights and weekends in your region's time zone. Schedule large imports accordingly.

Permissions as a Hidden Blocker

SharePoint supports up to 50,000 unique permissions on a list or library, but Microsoft recommends a general limit of 5,000. Migration tools like ShareGate explicitly warn about too many uniquely permissioned items. A migration tool cannot fix a destination model that is already permission-fragmented. (support.microsoft.com)

Comparing SharePoint Migration Tools

When native import methods hit their limits, dedicated migration tools fill the gap.

Microsoft SPMT (SharePoint Migration Tool)

As Microsoft's native tool, it integrates seamlessly with SharePoint Online and OneDrive, supporting content migration from SharePoint Server 2010, 2013, 2016, and 2019, as well as file shares. SPMT preserves file structures, metadata, permissions, and version history during migration.

Where it works: On-prem to online migrations, file share imports. It's free, handles incremental migration, and offers PowerShell capabilities for large-scale enterprise migrations.

Where it breaks: Compared to third-party tools, SPMT lacks some advanced features. No support for non-Microsoft sources (Google Drive, Box, Confluence). Limited remapping capabilities — what comes in is structurally similar to what was there.

ShareGate

ShareGate stands out for its intuitive, user-friendly interface that requires minimal technical expertise. Its drag-and-drop functionality makes it accessible even for beginners while still offering powerful migration capabilities.

Where it works: Tenant-to-tenant migrations, reorganizing site collections, mid-size migrations where the team wants a visual interface and strong reporting. Its pre-check can simulate a migration before copying data, which is genuinely useful.

Where it breaks: It can get pricey for large-scale migrations. Licensing starts around $5,995/year. Cannot function incrementally. Does not migrate at the farm level. Content migrations are performed one list or library at a time, and SharePoint destination limits such as too many unique permissions still apply.

AvePoint Fly

AvePoint's migration tools go beyond SharePoint, enabling built-in versatility. Its SharePoint-friendly features include scanning content before migration to filter, map, and monitor the migration in real time. AvePoint can migrate term stores, site settings, customized views and forms, and non-SharePoint data from sources like HCL Notes and OpenText.

Where it works: Enterprise-scale, multi-source, compliance-heavy environments where you need governance alongside migration.

Where it breaks: The downside is cost and complexity, which can be more than a small or midsized team needs for a simple project.

Custom Scripts (Graph API / PowerShell)

Where they work: When you need full control — custom field mapping, conditional logic, handling edge cases programmatically, setting historical timestamps, managing attachments and metadata in a single pipeline.

A basic Graph API list-item create call is straightforward:

POST /sites/{site-id}/lists/{list-id}/items
Content-Type: application/json
 
{
  "fields": {
    "Title": "Widget",
    "Color": "Purple",
    "Weight": 32
  }
}

That simplicity is real for flat fields, but it hides the hard part: lookups, people, taxonomy, attachments, and page reconstruction all need extra logic. Graph exposes document-library entries as both listItem and driveItem, which is why binary files and metadata often have to be migrated in separate steps.

Where they break: They require engineering time to build, test, and maintain. Throttle handling, error recovery, pagination logic, and idempotency all fall on your team. For a deeper look at why DIY scripts fail at scale, see Why DIY AI Scripts Fail and How to Engineer Accountability.

Warning

Migration tools reduce manual work. They do not remove SharePoint's own limits. If the destination model has unsupported field types, too many unique permissions, or a structure mismatch, the tool still needs a workaround.

When Standard Tools Fail

Native import and off-the-shelf tools work for straightforward scenarios — file share to document library, on-prem list to online list, small CSV to new list. They start failing when:

  • Relational structures must be preserved. Your source has interlinked tables with lookup columns, calculated fields, and managed metadata taxonomies. A CSV import flattens all of this.
  • Historical metadata matters. Created/Modified dates, original authors, and version history can't be set through normal import paths.
  • Scale exceeds tool tolerances. 500,000 list items with attachments, requiring batched API calls that respect throttle limits over days of runtime.
  • Multiple content types coexist. A single list has 4 content types with different column schemas. CSV import creates one flat schema.
  • Data transformation is required mid-flight. Values need reformatting, columns need splitting or merging, lookups need resolving against reference data.
  • Modern pages need to be rebuilt, not stored as files. Uploaded HTML or Markdown doesn't become editable SharePoint content.

This is where ClonePartner operates. We write custom migration scripts that handle API pagination, respect SharePoint's resource unit quotas, implement automatic Retry-After backoff, and map complex relational structures into SharePoint's type system — including setting historical timestamps via ValidateUpdateListItem and preserving attachment hierarchies.

The difference from DIY scripting: we've already solved the edge cases. We know that concurrent uploads sharing a connection pool hit the 600-call/minute connector limit before the tenant-level RU limit. We know that managed metadata term store imports require pre-provisioning terms before items can reference them. We know that ValidateUpdateListItem is the only reliable way to set Created and Modified fields. These aren't in any migration tool's checkbox — they're engineering decisions made per-project.

Pre-Migration Checklist for SharePoint Data

Before you touch the import button:

1. Classify Your Data

Decide what belongs in a SharePoint list, what belongs in a document library, what needs modern pages, and what should live in the term store. Don't dump everything into one library and promise yourself you'll fix it later. See Mastering SharePoint Information Architecture for guidance on structuring this correctly.

2. Clean Your Source Data

  • Remove empty rows and columns. SharePoint creates blank list items for empty rows in a CSV.
  • Standardize date formats. Use ISO 8601 (YYYY-MM-DD) or explicitly match your SharePoint site's regional settings.
  • Strip leading/trailing whitespace. Especially in columns that will become lookup keys.
  • Deduplicate. SharePoint won't do this for you.
  • Create a stable external ID. Avoid guessing which column becomes the durable key.

3. Choose the Right Encoding

Always export as UTF-8 with BOM if your data contains non-ASCII characters. In Excel: File → Save As → CSV UTF-8 (Comma delimited). Verify in a text editor before uploading.

4. Design Your Target List Schema First

Don't let SharePoint auto-create columns from CSV headers. Instead:

  • Create the list manually with the correct column types (Choice, Lookup, Date, Managed Metadata).
  • Index columns you'll filter on before the list exceeds 5,000 items. Automatic index creation when sorting in the modern experience is limited to lists and libraries with fewer than 20,000 items.
  • Seed lookup lists, term sets, and user references before importing child records.

5. Plan for the 5,000-Item Threshold

A single view can't return more than 5,000 items, but the list or library itself can hold up to 30 million, organized into sub-folders or filtered views that each stay under the threshold.

If your import exceeds 5,000 rows, create filtered views and indexed columns before importing. This prevents the "operation is prohibited" error from locking out access to the list.

6. Run a Pilot Import

Import 50–100 rows first. Verify column types, date formatting, special character rendering, and lookup resolution. Include difficult records — attachments, metadata, and lookup values — so you catch mapping issues early. Fix problems before running the full import.

Understanding how data comes out helps you understand how it should go in — review how to export data from SharePoint to validate your structure is usable long-term.

Real-World Migration Scenarios

Scenario 1: 20,000-Row Spreadsheet → SharePoint List

Source: An operations team with a 20,000-row Excel tracker — product IDs, statuses, assigned owners, date ranges.

Wrong approach: Import via the "From Excel" option. This creates a new list with auto-detected types, hits the 5,000-row view threshold immediately, and likely misdetects the product ID column as Number (stripping leading zeros).

Right approach:

  1. Create the SharePoint list manually with correct column types (Product ID as Text, Owner as Person, Status as Choice).
  2. Index the Status and Owner columns.
  3. Create filtered views (e.g., by Status) that each return <5,000 items.
  4. Use Power Automate with pagination enabled (threshold: 20,000) to read from the Excel file stored in OneDrive/SharePoint and write row-by-row to the list.
  5. Expect 2–3 hours for the flow to complete. Monitor for HTTP 429 retries.

Scenario 2: CRM Export → SharePoint List (Preserving Relations)

A financial services firm exports 50,000 accounts from a legacy CRM. Each account has linked contacts, activities, and document references. A flat CSV loses all relationships.

The approach: Export each entity type separately. Create a SharePoint list for Accounts, one for Contacts, one for Activities. Import Accounts first (establishing internal IDs), then import Contacts with a Lookup column pointing to the Accounts list. This requires API-level imports — the Graph API POST /sites/{site-id}/lists/{list-id}/items endpoint lets you set lookup field values by referencing existing item IDs.
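In Graph payloads, a lookup column is set through its '&lt;InternalName&gt;LookupId' field using the target item's integer ID. A sketch of building the second-pass Contact payload; all column names and the ID map are illustrative:

```python
def contact_payload(contact, account_item_ids):
    """Build the Graph fields payload for one Contact row whose lookup
    points at an already-imported Accounts item. Graph exposes lookup
    columns as '<InternalName>LookupId' integer fields."""
    return {
        "fields": {
            "Title": contact["Name"],
            # Resolve the CRM's account key to the SharePoint item ID
            # captured during the first-pass Accounts import.
            "AccountLookupId": account_item_ids[contact["AccountExternalId"]],
        }
    }

item_ids = {"ACC-0009": 42}  # external ID -> SharePoint item ID, from pass one
payload = contact_payload({"Name": "Ada Example", "AccountExternalId": "ACC-0009"}, item_ids)
```

POST each payload to the Contacts list's items endpoint after the Accounts pass completes.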

This kind of relational preservation is where the gap between "import a CSV" and "migrate a system" becomes clearest. If you're mapping databases to SharePoint lists, our guide on Mapping Notion Databases to SharePoint Lists covers the structural decisions in detail.

Scenario 3: Wiki or Knowledge Base → SharePoint

Migrating structured wiki content from Confluence or similar tools into SharePoint means splitting pages (rich content) from structured data (tables, metadata). Source macros and embedded tables don't translate to SharePoint web parts or list constructs.

If the goal is just archiving, upload files into a library and stop there. If the goal is a usable modern intranet, uploaded files aren't enough — modern pages need to be recreated as sitePage objects with layout and web-part structure. That's a transform project, not a file import.

For a detailed breakdown of this scenario, see Confluence to SharePoint Migration: Methods, Limits & Macro Mapping.

Making the Right Choice

The best import method depends on three variables: data volume, structural complexity, and required fidelity.

  • Under 5,000 rows, flat structure, no relationships: Native CSV or Excel import. Done in minutes.
  • 5,000–50,000 rows, simple columns, recurring syncs: Power Automate with pagination. Budget 2–6 hours per run.
  • On-prem SharePoint to Online, standard content types: SPMT. Free and purpose-built.
  • Tenant-to-tenant, reorganization needed: ShareGate or AvePoint. Budget $5K–$50K+ depending on scope.
  • Complex relations, historical data, custom metadata, 100K+ items: Custom API scripts or a managed migration service.

No single tool handles every case. Match the tool to the complexity. If you're unsure which bucket your migration falls into, or if you've already hit throttling limits and encoding issues trying to do it yourself — that's exactly the kind of problem we solve.

Frequently Asked Questions

What is the SharePoint list view threshold and can I change it?
The list view threshold is 5,000 items per view in SharePoint Online. It cannot be changed — it's fixed by Microsoft to protect multi-tenant performance. Lists can still store up to 30 million items, but you must use indexed columns and filtered views to stay under 5,000 items per view.
Why does Power Automate only import 256 rows from Excel?
The 'List rows present in a table' action defaults to 256 rows. To fix this, open the action's Settings, enable Pagination, and set the threshold (up to 100,000 for instant flows, 5,000 for scheduled flows). Without this setting, Power Automate silently stops at 256.
What is the maximum file size I can upload to SharePoint?
Via browser or OneDrive sync, the limit is 250 GB per file. Via REST API or CSOM in a single request, the limit drops to 250 MB — you must use file chunking for anything larger. List item attachments are capped at 250 MB.
Why does my CSV upload to SharePoint show blank data?
This is almost always an encoding issue. SharePoint expects UTF-8, but Excel on Windows defaults to ANSI encoding when saving CSVs. Re-save as 'CSV UTF-8 (Comma delimited)' from Excel, or convert the encoding using a text editor before uploading.
Does SharePoint import Markdown or HTML as modern pages?
No. You can upload those files to a document library, but modern page content is represented separately as sitePage objects with canvasLayout and needs transform or rebuild work. Uploading files does not create editable SharePoint pages.

More from our Blog

The Ultimate Guide to Mapping Notion Databases to SharePoint Lists

This engineer-led guide details exactly how to map Notion databases to SharePoint lists while preserving your critical relationships. Learn the 4-step architecture for converting Columns to Types, Relations to Lookups, and using SharePoint 'Dynamic Filtering' to recreate the Notion dashboard experience—ensuring zero data loss during your migration.

Raaj Raaj · · 8 min read