Troubleshooting PI System Crawler Failures: Errors, Root Causes, and Modern Alternatives
Explore the common errors and root causes behind OSIsoft PI Search Crawler failures, learn about security and configuration best practices, and see why upgrading to PI Vision 2020 or later is often the best solution.
Roshan Soni
If you've ever been responsible for maintaining OSIsoft PI Data Archives and their associated search capabilities, you might have encountered perplexing errors during crawl operations. These errors can obstruct your ability to search PI data efficiently and often require prompt troubleshooting. In this article, we'll share best practices for diagnosing PI System crawler failures, explore common root causes, and discuss whether it's time to adopt modern alternatives to the legacy Crawler.
Common Symptoms of Crawler Failure
A recent real-world case illustrates typical issues:
- Error: Cannot upload search items. Exceptions: An error occurred while sending the request.
- Error: During startup, encountered a database in Crawling or Incremental Crawling state. Attempt to rebuild index failed.
- Exception: System.ArgumentNullException: Value cannot be null. Parameter name: request.
- Warning: Crawler access mismatch — the PI Data Archive contains more points than the crawler can access.
In this scenario, a simple restart of the relevant service allowed crawling to resume, but the root cause of the failure was less obvious.
Diagnosing the Issues
Let's break down the main issues:
1. Upload Failures and HTTP Exceptions
The error message indicates the Crawler failed to upload search items to the search indexer. Typical causes include:
- Network hiccups between the Crawler and PI Data Archive or Search service
- Authentication or permission problems
- Configuration errors in the Crawler or Search Service
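Transient failures like "An error occurred while sending the request" are usually worth retrying a few times before treating them as a configuration problem. As a rough illustration (not OSIsoft's actual API), here is a bounded retry with exponential backoff; `upload_fn` is a hypothetical stand-in for the Crawler's upload step:

```python
import time

class TransientUploadError(Exception):
    """Stand-in for 'An error occurred while sending the request.'"""

def upload_with_retry(upload_fn, items, attempts=3, base_delay=1.0):
    """Retry a flaky upload with exponential backoff.

    upload_fn is a hypothetical callable that raises
    TransientUploadError on network or service hiccups.
    """
    for attempt in range(1, attempts + 1):
        try:
            return upload_fn(items)
        except TransientUploadError:
            if attempt == attempts:
                # Persistent failure: likely security or config, not a blip.
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))
```

If the error survives several retries, suspect the authentication or configuration causes above rather than the network.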
2. Index Rebuild and ArgumentNullException
An ArgumentNullException when attempting to rebuild the index points to improper handling of a failed request or misconfiguration in the code or service settings. It often means the service expected a configuration value, security credential, or endpoint that was missing or unavailable during startup.
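The usual defense against this class of failure is to validate required settings up front and fail with a descriptive message, rather than letting a null value surface deep inside the index-rebuild path. A minimal sketch, with illustrative setting names that are not the Crawler's real configuration keys:

```python
# Hypothetical setting names for illustration only.
REQUIRED_SETTINGS = ("search_endpoint", "service_account", "archive_host")

def validate_crawler_config(config: dict) -> None:
    """Fail fast at startup with a clear message instead of an
    ArgumentNullException-style error during the index rebuild."""
    missing = [k for k in REQUIRED_SETTINGS if not config.get(k)]
    if missing:
        raise ValueError(
            "Missing required crawler settings: " + ", ".join(missing)
        )
```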
3. Security Warnings: Missing Point Access
The warning about the Crawler having access to fewer points than exist in the archive suggests security permissions are not adequately configured. The Crawler's service account must have read access to all PI points that should be indexed for search.
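A quick way to quantify such a mismatch is to diff the archive's full point list against what the service account can actually read. The sketch below assumes both lists are already available (in practice, from an administrative query and from the crawler's own scan); the function itself is hypothetical:

```python
def report_point_access_gap(all_points, accessible_points):
    """Return the PI points the crawler's service account cannot see.

    all_points and accessible_points are iterables of point names,
    obtained elsewhere; this only computes and reports the gap.
    """
    hidden = sorted(set(all_points) - set(accessible_points))
    if hidden:
        print(f"Crawler lacks read access to {len(hidden)} point(s), "
              f"e.g. {hidden[:10]}")
    return hidden
```

A non-empty result points directly at the point-security (identity/mapping) configuration to fix.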
Common Workarounds and the "Reboot/Recrawl" Cycle
Many administrators resort to restarting services or servers and recrawling databases — which may resolve transient problems but do not address underlying configuration or architectural weaknesses. Recrawling the entire Data Archive after every issue is time-consuming and disruptive, especially in large installations.
The Crawler's Limitations — And Why Upgrade
The OSIsoft Search Crawler, while useful, has always struggled with reliability and complex permission management. Community feedback echoes these pain points:
- "Crawler is relatively new, but it's already a legacy product!"
- "Reboot days are a nightmare, and the only solution is to recrawl the DB."
OSIsoft listened, and in PI Vision 2020 and later the Crawler dependency for search was removed in favor of direct PI AF SDK queries. This change means:
- Faster, real-time search results
- Simpler configuration with fewer moving parts
- Reduced chance of encountering permission mismatches
Recommendations
- Check and fix security permissions. Ensure the Crawler's service account has read access to all PI points and AF assets that should be indexed.
- Review network connectivity and firewalls between the PI Data Archive, Crawler, and Search Service.
- Consider upgrading to PI Vision 2020 or later to retire the Crawler and enjoy more reliable and modern search capabilities.
- Monitor logs regularly for recurring patterns, and automate alerts for crawl failures.
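The last recommendation can be automated with a simple log scan keyed to the failure signatures discussed earlier. A minimal sketch, assuming plain-text log lines; the patterns are taken from the errors above and should be extended for your environment:

```python
import re

# Failure signatures drawn from the errors discussed in this article.
FAILURE_PATTERNS = [
    re.compile(r"Cannot upload search items"),
    re.compile(r"Attempt to rebuild index failed"),
    re.compile(r"ArgumentNullException"),
]

def scan_crawler_log(lines):
    """Return (line_number, line) pairs matching a known failure
    signature, suitable for feeding into an alerting hook."""
    hits = []
    for n, line in enumerate(lines, start=1):
        if any(p.search(line) for p in FAILURE_PATTERNS):
            hits.append((n, line))
    return hits
```

Wiring the result into whatever alerting channel you already use (email, Teams, a ticketing system) turns the reboot/recrawl cycle from a surprise into a scheduled response.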
Conclusion
Crawler failures are disruptive, but armed with an understanding of the error messages, permissions model, and available upgrades, you can resolve current issues and future-proof your PI System architecture. If you're still plagued by Crawler instability, the answer may be to move forward with PI Vision's new search paradigm and retire the old Crawler for good.
Are you experiencing crawler issues? Share your tips, solutions, or upgrade stories in the comments below!