Crawling Solutions
Crawling is the foundation of Cartographer. It discovers every component in a Dataverse solution and catalogs it for documentation, AI summaries, and analysis.
What Crawling Does
When you crawl a solution, Cartographer queries the Dataverse metadata APIs to discover every component that belongs to that solution. It reads the raw metadata for each component — table definitions, column schemas, form XML, view FetchXML, plugin step registrations, cloud flow definitions, and more — and stores a normalized record in the Component table.
Crawling is solution-scoped: you choose exactly which solutions to crawl. Cartographer never scans the entire environment wholesale — you control which solutions are documented and which are ignored.
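Cartographer's internals aren't exposed, but the discovery step is conceptually a Dataverse Web API query against the `solutioncomponents` entity set, filtered by solution ID. A minimal sketch of building that request URL (the org URL and GUID below are placeholders, and the crawler's real query is an assumption):

```python
def solution_components_url(org_url: str, solution_id: str) -> str:
    """Build the Dataverse Web API query that lists every component
    registered to a solution. Each row carries the component's object
    id and a numeric componenttype code identifying what it is."""
    return (
        f"{org_url}/api/data/v9.2/solutioncomponents"
        f"?$select=objectid,componenttype"
        f"&$filter=_solutionid_value eq {solution_id}"
    )

# Example with placeholder values:
url = solution_components_url(
    "https://contoso.crm.dynamics.com",
    "00000000-0000-0000-0000-000000000001",
)
```

From each row, a crawler can then fetch the component's full metadata (table definition, form XML, flow definition, and so on) based on its type code.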
Incremental Crawling
After the initial crawl, subsequent crawls are incremental — Cartographer only re-processes components that have been added, modified, or removed since the last crawl. This makes re-crawls fast and efficient.
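In outline, an incremental crawl is a diff over component ids and last-modified timestamps between the cached catalog and the live solution. A simplified sketch (the data shapes are illustrative, not Cartographer's actual schema):

```python
def plan_incremental_crawl(cached: dict, live: dict):
    """cached/live map component id -> last-modified timestamp.
    Returns (ids to (re)process, ids to remove from the catalog)."""
    added   = live.keys() - cached.keys()
    removed = cached.keys() - live.keys()
    updated = {cid for cid in live.keys() & cached.keys()
               if live[cid] > cached[cid]}
    return sorted(added | updated), sorted(removed)

to_process, to_remove = plan_incremental_crawl(
    cached={"acct_table": 100, "acct_form": 100, "old_view": 90},
    live={"acct_table": 100, "acct_form": 120, "new_flow": 130},
)
# to_process == ["acct_form", "new_flow"]; to_remove == ["old_view"]
```

Unchanged components (`acct_table` above) are skipped entirely, which is what keeps re-crawls fast.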
Adding a Solution to Crawl
1. Open Solution Manager. In the Cartographer app, navigate to Solution Manager in the left sidebar.
2. Click Add Solution. Click the Add Solution button in the command bar. A panel appears with a dropdown of all solutions in your environment.
3. Select a solution. Choose the solution you want to crawl from the dropdown. You can search by solution name. Only managed and unmanaged solutions are shown; default solutions (such as the Common Data Service Default Solution) are excluded.
4. Configure crawl settings. Optionally configure the crawl frequency (manual, daily, or weekly) and any component type filters. By default, all component types are included.
5. Save and crawl. Click Save to add the solution, then click Crawl Now to start the initial crawl. The progress indicator shows how many components are being discovered.
Component Types Discovered
Cartographer discovers and catalogs 34 distinct component types. Each type is processed differently to extract the most useful metadata.
Data Model
- Tables (Entities)
- Columns (Attributes)
- Relationships
- Global Option Sets
- Keys
User Interface
- Model-Driven App Forms
- Views (SavedQuery)
- Charts
- Dashboards (SystemForm)
- Sitemaps
- Command Bars (RibbonCustomization)
Automation
- Plugin Assemblies
- Plugin Steps (SDK Message Processing)
- Cloud Flows (Automated)
- Cloud Flows (Scheduled)
- Cloud Flows (Instant)
- Business Rules
- Classic Workflows
- Actions
Web Resources
- JavaScript
- HTML
- CSS
- Images (PNG/JPG/SVG)
- XML Data
- Resx (Localization)
Security
- Security Roles
- Field Security Profiles
- Connection Roles
Other
- Environment Variables
- Custom Connectors
- Canvas Apps
- Custom Controls (PCF)
- Templates
- Web Hooks
Crawl Frequency
You can configure how often Cartographer re-crawls each solution to keep documentation up to date.
| Frequency | Description | Availability |
|---|---|---|
| Manual | Crawl only when you click "Crawl Now" | All plans |
| Daily | Automatically re-crawls once every 24 hours | Professional+ |
| Weekly | Automatically re-crawls once every 7 days | Professional+ |
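The scheduling rule in the table reduces to a small function over the last crawl time. A sketch (the frequency names mirror the table; everything else is an illustrative assumption):

```python
from datetime import datetime, timedelta

def next_crawl_due(last_crawl: datetime, frequency: str):
    """Return when the next automatic crawl is due, or None for
    manual-only solutions (crawled only via Crawl Now)."""
    if frequency == "manual":
        return None
    intervals = {"daily": timedelta(hours=24), "weekly": timedelta(days=7)}
    return last_crawl + intervals[frequency]

due = next_crawl_due(datetime(2024, 1, 1, 9, 0), "weekly")
# due == datetime(2024, 1, 8, 9, 0)
```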
Component Limits by Plan
The total number of components you can crawl depends on your plan tier. The limit applies across all crawled solutions combined.
| Plan | Component limit | Best for |
|---|---|---|
| Free | 20 components | Exploring a single small solution |
| Professional | 200 components | Most production environments |
| Enterprise | Unlimited | Complex, multi-solution organizations |
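Enforcement amounts to checking the combined component count against the plan's ceiling before a crawl adds more. A sketch (the plan keys and the None-means-unlimited convention are assumptions, not Cartographer's actual code):

```python
# Assumed plan limits, matching the tiers above; None means unlimited.
PLAN_LIMITS = {"free": 20, "professional": 200, "enterprise": None}

def can_crawl(plan: str, current_total: int, new_components: int) -> bool:
    """True if adding new_components keeps the combined total across
    all crawled solutions within the plan's component limit."""
    limit = PLAN_LIMITS[plan]
    return limit is None or current_total + new_components <= limit

ok = can_crawl("free", 15, 5)       # exactly at the 20-component cap
blocked = can_crawl("free", 15, 6)  # would exceed it
```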
Re-crawling and Removing Solutions
Re-crawling
Click Crawl Now on any solution to trigger an immediate re-crawl. Incremental crawling means only changed components are reprocessed. Use Full Re-crawl if you want to discard all cached data and start fresh.
Removing a solution
To stop crawling a solution, select it in Solution Manager and click Remove. This deletes the crawl configuration and all associated component records from the Cartographer database. It does not affect the actual solution in your environment.
Crawl Log
Every crawl operation is recorded in the Crawl Log. You can view it from the Solution Manager to see:
- Start and end timestamps
- Number of components discovered, updated, and removed
- Crawl duration
- Any errors or warnings encountered
- Whether the crawl was manual or scheduled
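The fields above amount to a simple record per crawl, with the duration derivable from the timestamps. A sketch of such a record (field names are illustrative, not Cartographer's actual schema):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CrawlLogEntry:
    started: datetime
    finished: datetime
    discovered: int
    updated: int
    removed: int
    trigger: str                       # "manual" or "scheduled"
    errors: list = field(default_factory=list)

    @property
    def duration_seconds(self) -> float:
        """Crawl duration derived from the start/end timestamps."""
        return (self.finished - self.started).total_seconds()

entry = CrawlLogEntry(
    started=datetime(2024, 1, 1, 9, 0, 0),
    finished=datetime(2024, 1, 1, 9, 2, 30),
    discovered=42, updated=5, removed=1, trigger="manual",
)
# entry.duration_seconds == 150.0
```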
Next Steps
Once your solution is crawled, generate AI summaries for your components.