Errors

Opensolr Errors — find answers to your questions

Drupal - Can not get additional data from Solr timeout after...

The Problem

Your Drupal site shows this warning:

Solr was unable to retrieve additional data about the server.

And in your logs you see cURL error 28 — a timeout while Drupal tries to reach /admin/luke on your Solr server.


What Is Actually Happening

Drupal's search_api_solr module periodically calls Solr's /admin/luke endpoint to discover what fields exist in the index, check capabilities, and verify the server is healthy. The Luke handler is like asking Solr to give you a complete X-ray of the entire index — every field, every term count, everything.

On large indexes, this response can run to megabytes of data and take several seconds to generate. If that takes longer than the timeout configured for Drupal's HTTP client, the request fails with cURL error 28.

[Diagram: why /admin/luke causes timeouts]
  • Drupal asks /admin/luke → the Luke handler scans the ENTIRE index (fields, terms, stats) → 10+ seconds, too slow → TIMEOUT, cURL error 28
  • The fix: replace Luke with a fast Ping handler → Drupal asks /admin/luke → the Ping handler just says "I am OK!" → 200 OK, instant


The Fix

Replace Solr's heavy Luke handler with a lightweight Ping handler at the same URL. Drupal only needs a "yes, I am alive" response — it does not actually need the full Luke data dump.

Add this to your solrconfig.xml:

<requestHandler name="/admin/luke" class="solr.PingRequestHandler">
  <lst name="defaults">
    <str name="q">*:*</str>
  </lst>
</requestHandler>

Step-by-Step for Opensolr Users

  1. Log in to your Opensolr Dashboard
  2. Open your Index Settings for the affected Opensolr Index
  3. Go to Config Files Editor and edit solrconfig.xml
  4. Find any existing /admin/luke request handler block and replace it with the snippet above
  5. Save your changes — Opensolr reloads the config automatically

Verify It Works

curl -sS -u user:pass -w '%{http_code}\n' \
  "https://your-cluster.opensolr.com/solr/your_core/admin/luke?wt=json"

You should get an instant 200 response with a small JSON payload.
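If you prefer scripting the check, here is a minimal Python sketch that times the response; the base URL and core name are placeholders for your own:

```python
import time
import urllib.request


def luke_url(base_url: str, core: str) -> str:
    """Build the /admin/luke URL for a core (base_url and core are placeholders)."""
    return f"{base_url.rstrip('/')}/solr/{core}/admin/luke?wt=json"


def check_luke(base_url: str, core: str, timeout: float = 5.0):
    """Request /admin/luke and return (HTTP status, elapsed seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(luke_url(base_url, core), timeout=timeout) as resp:
        status = resp.status
    return status, time.monotonic() - start
```

With the Ping handler in place, `check_luke` should come back with a 200 well under a second; before the fix it would time out on large indexes.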


Alternative: Skip Schema Check in Drupal

If you cannot modify Solr configs, you can tell Drupal to stop calling Luke entirely:

// settings.php
$settings['search_api_solr']['skip_schema_check'] = TRUE;
$settings['search_api_solr']['timeout'] = 3;
$settings['search_api_solr']['connect_timeout'] = 1;

This disables schema introspection. Search and indexing continue to work normally.


Trade-offs

  • Replace Luke with Ping (recommended). Pros: instant response, Drupal is happy. Cons: Luke introspection is disabled at that URL.
  • Skip the schema check in Drupal. Pros: no Solr config changes needed. Cons: Drupal cannot detect schema mismatches.
  • Increase the PHP timeout. Pros: no changes to Solr or Drupal. Cons: still slow; it only delays the problem.

Quick Checklist

  • Replace /admin/luke handler with PingRequestHandler in solrconfig.xml
  • Save and let Opensolr reload the config
  • Verify with a quick cURL call that /admin/luke responds instantly
  • Optionally clear Drupal caches with drush cr
  • If you cannot edit Solr configs, use skip_schema_check in Drupal settings.php

Need help fixing Drupal timeouts? Reach out to us at support@opensolr.com — we can adjust your Solr config directly.


Cannot change field from DOCS_AND_FREQS_AND_POSITIONS DOCS_A...

The Error

You reload your Opensolr Index or try to index documents and Solr throws:

cannot change field "field_name" from index options=DOCS_AND_FREQS_AND_POSITIONS
to inconsistent index options=DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS

This means the field was originally indexed with one set of "index options," and now your schema (or application) is trying to use a different set. Solr will not allow mixing two different formats in the same index.


What Are Index Options? (The Simple Version)

When Solr indexes a word in a field, it can store different levels of detail about that word:

[Diagram: levels of detail in index options]
  • DOCS: "Which documents contain this word?" (basic search)
  • + FREQS: "How many times does it appear?" (relevancy scoring)
  • + POSITIONS: "Where in the text is each occurrence?" (phrase queries)
  • + OFFSETS: "Exact character start and end?" (fast highlighting)
If old documents were indexed with POSITIONS and new documents with POSITIONS + OFFSETS, Solr cannot mix the two formats in the same index segment. The fix: make the field definition consistent and re-index everything.


Why This Happens

Schema Change Without Re-indexing

You (or a module like Drupal Search API) changed the field definition — for example, adding storeOffsetsWithPositions="true" for faster highlighting. The new definition requires OFFSETS, but old documents were indexed without them. Solr cannot mix both.

Drupal Search API Solr

This is especially common with Drupal. The Search API Solr module may set storeOffsetsWithPositions="true" on certain fields automatically when you enable highlighting features. The module generates the schema, and if the generated schema differs from what was previously indexed, you get this error.


How to Fix It

Step 1: Decide Which Index Options You Want

  • DOCS_AND_FREQS_AND_POSITIONS — standard, good for most use cases
  • DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS — needed for fast highlighting (term vectors with offsets)

Step 2: Update Your Schema

Make the field definition consistent. If you want offsets for highlighting:

<field name="your_field" type="text_general" indexed="true"
       stored="true" storeOffsetsWithPositions="true"/>

If you do not need offsets, make sure storeOffsetsWithPositions is false or absent.

Step 3: Reset and Re-Index

After changing the schema, you must clear and rebuild the entire index:

  1. Reset your index from the Opensolr control panel (remove all data)
  2. Reload the Opensolr Index
  3. Re-index all documents from your application

There is no shortcut — old documents with the old format must be replaced.
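If you script the rebuild, the wipe in step 1 can also be issued through Solr's JSON update API. A minimal sketch, assuming a placeholder core URL; run it only after the schema is consistent:

```python
import json
import urllib.request


def delete_all_command() -> dict:
    """Solr JSON update body: delete every document, then commit."""
    return {"delete": {"query": "*:*"}, "commit": {}}


def wipe_index(core_url: str) -> int:
    """POST the delete-all command to the core's update handler; returns the HTTP status.
    core_url is a placeholder, e.g. the base URL of your Opensolr core."""
    req = urllib.request.Request(
        f"{core_url.rstrip('/')}/update",
        data=json.dumps(delete_all_command()).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

After the wipe, reload the index and re-feed all documents from your application so everything is written with the new index options.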


Quick Reference

  1. Check the field definition: look for storeOffsetsWithPositions in the schema
  2. Make it consistent: either add or remove offsets, and pick one
  3. Reload the schema: click Reload in the Opensolr control panel
  4. Wipe the index: reset/delete all indexed documents
  5. Re-index: rebuild from your application or CMS
  6. Check Drupal settings: if using Drupal, verify that the Search API module options match

Quick Checklist

  • Check your schema for storeOffsetsWithPositions on the affected field
  • Make the field definition consistent — old and new must match
  • Reset your index and re-index all data after the schema change
  • If using Drupal, check the Search API Solr module highlighting settings
  • Always reload the Opensolr Index after schema changes

Need help resolving an index options conflict? Reach out to us at support@opensolr.com — we will help you get your schema sorted.


SolrException: Invalid Number for field

The Error

You are indexing documents and Solr rejects them with:

org.apache.solr.common.SolrException: Invalid Number: "some_value"
for field price

This means you sent a value that is not a valid number into a field that expects a number. Solr cannot turn "hello" into an integer, so it drops the entire document.


What Is Actually Happening

Your schema says a field is numeric (like pint, pfloat, plong, pdouble), but your application is sending something that is not a number — a string, an empty value, "NaN", "NULL", or garbage data.

[Diagram: numeric field type mismatch]
  • Values Solr accepts: 42, 3.14, -1000, 99.99
  • Values that cause the error: "foo", "", "NaN", "NULL", "$5"
Solr drops the ENTIRE document, not just the bad field: one bad value in one field means the whole document is rejected.


Common Scenarios

1. Empty or Null Values

Your data pipeline sends "" (empty string) or "null" for a numeric field. Solr cannot parse these as numbers.

2. Currency Symbols or Units

Values like "$19.99", "150kg", or "100%" contain non-numeric characters. Strip those before indexing.

3. Legacy or Dirty Data

Old records from a database migration, CSV import, or third-party API may contain unexpected values like "N/A", "TBD", or "unknown" in what should be numeric fields.

4. Schema Mismatch

The field was recently changed from text_general to pfloat in the schema, but old data still exists in the index, or the application was not updated to send proper numbers.
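All four scenarios come down to the same rule: only send Solr values that actually parse as numbers. A minimal Python sanitizer sketch; the set of "missing" markers is an assumption you should adapt to your own data:

```python
import re

# Markers that mean "no value" in many data sources (assumed; extend as needed)
_MISSING = {"", "null", "none", "n/a", "tbd", "unknown", "nan"}


def to_solr_number(raw):
    """Return a float safe to send to a numeric Solr field, or None to skip the field."""
    if raw is None:
        return None
    text = str(raw).strip()
    if text.lower() in _MISSING:
        return None
    # Strip currency symbols, thousands separators, and units like "kg" or "%"
    cleaned = re.sub(r"[^0-9.eE+-]", "", text)
    try:
        return float(cleaned)
    except ValueError:
        return None
```

Call it for every numeric field before building the document, and omit the field whenever it returns None; that way one dirty value never sinks the whole document.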


How to Fix It

Fix 1: Clean Your Data Before Indexing

Validate every value before sending it to Solr. If a field is numeric, make sure the value is actually a number:

// PHP example
$price = is_numeric($rawPrice) ? (float)$rawPrice : null;
if ($price !== null) {
    $doc['price'] = $price;
}
// Do not send the field at all if it is not a valid number

# Python example
try:
    doc['price'] = float(raw_price)
except (ValueError, TypeError):
    pass  # skip the field entirely

Fix 2: Change the Schema to Match Your Data

If the field genuinely receives mixed data (sometimes numbers, sometimes text), change the field type in your schema.xml to text_general or string instead of a numeric type.

<!-- If the field can contain non-numeric values -->
<field name="price" type="string" indexed="true" stored="true"/>

Note: this means you lose numeric sorting and range queries on that field.

Important: After changing a field type, you must reset your index data — delete all existing documents and re-index everything from scratch. The old indexed data was built with the previous field type and is incompatible. In your Opensolr Control Panel: upload the updated schema.xml, click Reload, then click Reset Data, and re-index all your content.

Fix 3: Handle Empty Values

If your application sometimes sends empty values, either:

  • Do not send the field when the value is empty
  • Set a default value in your schema: <field name="price" type="pfloat" default="0" .../>

How to Find the Bad Data

Check your Error Log in the Opensolr control panel. The error message tells you exactly:

  • Which field received the bad value
  • What the bad value was (in quotes)
  • Which document was being processed (if the ID is included)

Look for lines like:

SolrException: Invalid Number: "foo" for field price

Then search your source data for that value and fix it at the source.
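For CSV sources, a short Python sketch can report the offending rows before you index; the column name and sample data here are hypothetical:

```python
import csv
import io


def find_bad_numbers(csv_text, column):
    """Yield (row_number, value) for values in `column` that do not parse as numbers."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for lineno, row in enumerate(reader, start=2):  # row 1 is the header
        value = (row.get(column) or "").strip()
        try:
            float(value)
        except ValueError:
            yield lineno, value


sample = "id,price\n1,19.99\n2,N/A\n3,\n4,7"
bad = list(find_bad_numbers(sample, "price"))
# bad == [(3, "N/A"), (4, "")]
```

Point it at your real export instead of `sample` and you get the exact rows to fix at the source.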


Quick Checklist

  • Check the Error Log — it shows the exact field and bad value
  • Validate data before sending to Solr — is_numeric(), float(), parseInt()
  • Remove currency symbols, units, and non-numeric characters before indexing
  • Handle empty/null values — either skip the field or set a schema default
  • If the field truly needs mixed data, change the schema type to string

Still seeing Invalid Number errors? Reach out to us at support@opensolr.com with the error details — we will help you track down the bad data.

