Absolutely classic Solr blunder! This one comes up often when people get a little too creative with their schema changes. Let’s break it down, old-school style.
You’re seeing this error:
cannot change field "tcngramm_X3b_cn_field_content_title_1" from index options=DOCS_AND_FREQS_AND_POSITIONS to inconsistent index options=DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS
This is a Solr schema error. In short, Solr stores index “options” for fields—how much information (positions, offsets, etc.) is tracked for each field. Once a field is indexed with a certain setting, you can’t later try to index with different options without reindexing or fixing the schema.
The field tcngramm_X3b_cn_field_content_title_1 was defined/indexed with:

indexOptions=DOCS_AND_FREQS_AND_POSITIONS

but the new configuration is trying to index it with:

indexOptions=DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS

Solr sees this and says:
“You can’t change a field’s index options on the fly! Not in my house!”

Check your schema (schema.xml or managed-schema). Look at termVectors, storeOffsetsWithPositions, and indexOptions for this field.

Example: If you’re using storeOffsetsWithPositions="true", you need:

<field name="tcngramm_X3b_cn_field_content_title_1" type="text_general" indexed="true" stored="true" storeOffsetsWithPositions="true"/>
Solr doesn’t retroactively change old docs. After changing the schema, delete and reindex all docs for this core/collection.
Traditional steps:
1. Update the schema/field type as needed
2. Reload the Solr core/collection
3. Wipe the index (delete all docs)
4. Reindex from Drupal/Search API
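Step 3 (wiping the index) is a single update request. Here's a minimal sketch using Python's standard library — the host and index name are placeholders, and you'd still call `urlopen()` to actually send it:

```python
# Sketch: build the Solr "delete all documents" request before reindexing.
# The base URL below is a placeholder; substitute your real Opensolr index URL.
from urllib import request


def build_wipe_request(base_url: str) -> request.Request:
    """Build a POST to the update handler that deletes every document and commits."""
    payload = b"<delete><query>*:*</query></delete>"
    return request.Request(
        url=f"{base_url}/update?commit=true",
        data=payload,
        headers={"Content-Type": "text/xml"},
        method="POST",
    )


req = build_wipe_request("https://host.opensolr.com/solr/my_index")
# Send it with: request.urlopen(req)  (add auth as required by your setup)
```

After the delete commits, trigger a full reindex from Drupal/Search API so all documents are rebuilt with the new index options.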
Note: your Drupal/Search API module may set storeOffsetsWithPositions=true automatically. Check those module settings!

Step | What to do |
---|---|
1. Check field definition | In Solr, ensure index options are consistent |
2. Update as needed | Change to add offsets if required |
3. Reload schema | Reload Solr core/collection |
4. Wipe & reindex | Delete all docs and reindex |
5. Check Drupal configs | Make sure module options match Solr setup |
Once a field’s been indexed a certain way, all documents forever after must use the same index options (unless you reindex everything after changing the schema). It’s like family traditions—break one, and there’s chaos at Thanksgiving.
Contact Us about your issue and we’ll look deeper into it.
Make sure you contact us (or email [email protected]) in order to have remoteStreaming enabled for your Opensolr index. Once we’ve enabled that, you can go ahead and make sure the settings below are set up correctly in your index. This should work for any index, not just for use with Drupal.
You’re integrating Drupal 10 with OpenSolr and encountering this error:
Remote Streaming is disabled.

—even after setting enableRemoteStreaming="true" inside <requestDispatcher>.
OpenSolr does not rely solely on the <requestDispatcher> setting. While setting it is required, it’s not sufficient.
Here’s the complete step-by-step to get remote PDF indexing via stream.url working:
In your solrconfig.xml, this must exist inside <requestDispatcher>:

<requestDispatcher handleSelect="true">
<requestParsers enableRemoteStreaming="true" multipartUploadLimitInKB="2048000" />
</requestDispatcher>
Look for this section (or add it if it doesn’t exist):
<requestHandler name="/update/extract" class="solr.extraction.ExtractingRequestHandler">
<lst name="initParams">
<bool name="allowRemoteStreaming">true</bool>
</lst>
</requestHandler>
✅ This is the part that is usually forgotten. Without this, Solr will silently reject stream.url requests with that Remote Streaming is disabled error.
After updating and re-uploading your config files, reload your core/collection. You can then test remote PDF indexing using:
curl "https://host.opensolr.com/solr/<INDEX_NAME>/update/extract?stream.url=https://example.com/sample.pdf&literal.id=test-doc&commit=true"
You should see a 200 OK and no longer get the Remote Streaming is disabled error.
There isn’t a single perfect guide for this setup, so if you get stuck, reach out to [email protected]
If Solr is throwing the following error:
java.lang.IllegalArgumentException: cannot change DocValues type from <type> to <other_type> for field <fieldName>
The Solr Schema explains DocValues in this way:
DocValues is recommended (required, if you are using *Point fields) for faceting, grouping, sorting and function queries. DocValues will make the index faster to load, more NRT-friendly and more memory-efficient. [DocValues are] currently only supported by StrField, UUIDField, all *PointFields, and depending on the field type, they might require the field to be single-valued, be required or have a default value.
DocValues is set in many schema fields similar to this one:
<field name="date" type="pdate" docValues="true" indexed="true" stored="true" multiValued="false" />
The DocValues type is determined internally by Solr in the following way:
DocValues are only available for specific field types. The type chosen determines the underlying Lucene docValue type that will be used. The available Solr field types are:

- StrField and UUIDField:
  - If the field is single-valued (i.e., multiValued is false), Lucene will use the SORTED type.
  - If the field is multi-valued, Lucene will use the SORTED_SET type.
- Any Trie* numeric fields, date fields and EnumField:
  - If the field is single-valued (i.e., multiValued is false), Lucene will use the NUMERIC type.
  - If the field is multi-valued, Lucene will use the SORTED_SET type.
- Boolean fields
- Int|Long|Float|Double|Date PointField:
  - If the field is single-valued (i.e., multiValued is false), Lucene will use the NUMERIC type.
  - If the field is multi-valued, Lucene will use the SORTED_NUMERIC type.
From the above, you can see that Solr’s complaint that it “cannot change a DocValues type” implies that someone has changed the definition of the named field, probably by altering the docValues setting or by changing the multiValued setting.
Possible Solutions:
A field change requires you to reset your index (remove all index data) from your Opensolr Index Control Panel and Re-index the data. Otherwise, the conflict between schema and index will cause many downstream error messages.
If that fails, try creating a new index, and Re-index the data in the new index. (This requires setting up your Drupal Server connection information).
When indexing data in Solr, you may encounter the following error:
ERROR (qtp123456-78) [<replica>] o.a.s.h.RequestHandlerBase org.apache.solr.common.SolrException: Invalid Number: <bad value> for field <numeric field>
This message means Solr attempted to index a value into a numeric field (such as int, float, double, or long), but the provided value wasn’t a valid number. When this happens, Solr will drop the entire document from the index—so this error is no joke!
This error occurs if:

- Your source data contains a non-numeric value (e.g., "foo" instead of 42).
- The field is defined as numeric in schema.xml, but your code sends a string.
- Empty or placeholder values ("", "NaN", "NULL", "abc") end up in a numeric field.

First, check the field types in schema.xml.

Example: For a field <field name="price" type="pfloat" .../>, make sure you’re sending a valid float (e.g., 12.99).
Validate Before Sending: Reject or sanitize bad data before indexing.

Update Your Schema (If Needed): If the field should really accept non-numeric data, change its type in schema.xml accordingly. Otherwise, fix the source of the non-numeric data.

Handle Legacy or Unknown Data: Substitute a default value such as 0, or omit the field, depending on your business logic.

Sample Error Log:
org.apache.solr.common.SolrException: Invalid Number: "foo" for field price
Diagnosis:
- Field: price (expected: float)
- Value received: "foo" (invalid)

Resolution:
Update your application logic to ensure only numeric values are sent for the price field.
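The “validate before sending” advice can be sketched in a few lines of Python — the field name (price) and the fallback policy are illustrative assumptions, not part of any Solr API:

```python
# Sketch: sanitize numeric fields before sending documents to Solr,
# so values like "foo", "", or "NaN" never reach a pfloat/pint field.
def sanitize_float(value, default=None):
    """Return a clean float if value parses, else the default (e.g., None to drop it)."""
    try:
        result = float(value)
    except (TypeError, ValueError):
        return default
    # Reject NaN and infinities, which Solr's numeric fields also won't accept.
    if result != result or result in (float("inf"), float("-inf")):
        return default
    return result


doc = {"id": "test-doc", "price": "foo"}
cleaned = sanitize_float(doc["price"])  # -> None: drop the field or use a default
```

Whether you drop the field, substitute 0, or reject the whole document is a business-logic decision; the point is that the decision happens in your code, not as a failed Solr update.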
Don’t sweat it—sometimes even the best code lets a gremlin slip through.
If you keep running into this issue or can’t locate the source, contact OPENSOLR support for expert help.
Heads up, Solr wranglers! We’ve moved the maxBooleanClauses setting into your solr.xml (in your Solr home directory, alongside your cores/collections).
<solr>
<int name="maxBooleanClauses">90589</int> <!-- 🛠️ Bump your limit here -->
<shardHandlerFactory name="shardHandlerFactory"
class="HttpShardHandlerFactory">
<int name="socketTimeout">${socketTimeout:600000}</int>
<int name="connTimeout">${connTimeout:60000}</int>
</shardHandlerFactory>
</solr>
If you see:

org.apache.solr.common.SolrException: Too many boolean clauses (1025).

your queries (often with * or leading wildcards like *term) expand to many terms.

Increase maxBooleanClauses: adjust it in solr.xml (as shown above).
Trim your word lists: remove rarely used terms from synonyms.txt, stopwords.txt, and protwords.txt to reduce clause count.

Optimize your filter chain: use SynonymGraphFilterFactory at query time, not at index time. Place stop and prot word filters before wildcard or graph filters.

Avoid expansive wildcards: prefer trailing wildcards (term*) over leading wildcards.

Keep calm and Solr on! 😎🖋️
For any questions that look like those below:
Why am I not getting results for query A, in AJAX, but I am getting results for query A, without AJAX?
Why am I not getting results for query A?
Why am I not getting results for query B?
For all of the above questions, you should refer to the online Solr documentation, or the Drupal community (if using Drupal).
Opensolr provides the Solr as a Service platform.
Solr search results are not the responsibility of Opensolr; how your queries behave depends solely on your Solr implementation, or on the implementation of the CMS you are using.
Please be advised that your Opensolr index may fail to reload when using AnalyzingInfixSuggester.
It turns out that Drupal exports the Solr configuration zip archive erroneously.
Basically, you will need to manually edit solrconfig_extra.xml in order to explicitly specify a separate folder for each suggester dictionary.
You can click here to learn more, from the bug reported to the Drupal community.
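The manual fix looks roughly like the sketch below — the suggester names, fields, and folder names are illustrative assumptions; the essential part is giving each AnalyzingInfix dictionary its own indexPath so the suggesters don't collide on reload:

```xml
<searchComponent name="suggest" class="solr.SuggestComponent">
  <lst name="suggester">
    <str name="name">title_suggester</str>
    <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
    <!-- Separate folder for this dictionary's index -->
    <str name="indexPath">suggester_title_infix</str>
    <str name="field">title</str>
    <str name="suggestAnalyzerFieldType">text_general</str>
  </lst>
  <lst name="suggester">
    <str name="name">body_suggester</str>
    <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
    <!-- A different folder, so the two suggesters don't share one directory -->
    <str name="indexPath">suggester_body_infix</str>
    <str name="field">body</str>
    <str name="suggestAnalyzerFieldType">text_general</str>
  </lst>
</searchComponent>
```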
UPDATE: As of Feb 08 2023, the new Opensolr Config Files Uploader, should take care of these dependencies automatically, so the steps below should not be necessary.
However, if you still run into issues, you can try the steps below:
It is often the case (as it is with Drupal) that your config files will contain files like schema_extra.xml or solrconfig_extra.xml.
In this case, your main schema.xml and solrconfig.xml will contain references to various fields and types that are defined in those extra files.
Therefore, you need to split your config files archive into multiple archives, and upload them as follows:
- First upload the extra files (zip up the schema_extra.xml and other *_extra.xml files and upload that zip first)
- Second upload the main schema.xml file, along with all other resource files, such as stopwords.txt, synonyms.txt, etc.
- Third, upload a zip archive that contains solrconfig.xml alone.
The error message:
org.apache.solr.common.SolrException: Undefined field _text_
means that Solr received a query, filter, or request that references a field named _text_, but this field is not defined in your Solr schema.
This typically happens when:

- Your schema.xml does not declare a <field> or <dynamicField> for _text_.
- A query or request handler uses _text_ as the default field (e.g., df=_text_), but the field does not exist.
- You copied a config (e.g., from the example core) from elsewhere, but your current core/schema does not have _text_.

What is _text_ in Solr?

_text_ is a conventional field name, not a built-in Solr field. Typically, _text_ is a catch-all field that copies the content of multiple other fields (using <copyField>) to make full-text searching easier. It was common in old examples, but new setups may not include it. If your schema doesn’t define _text_, and a query refers to it, you’ll get this error.
Example query that fails:
http://localhost:8983/solr/mycore/select?q=solr&df=_text_
Solr error:
org.apache.solr.common.SolrException: Undefined field _text_
Define the _text_ Field in schema.xml

Add to your <fields> section:
<field name="_text_" type="text_general" indexed="true" stored="false" multiValued="true"/>
- type="text_general" uses Solr’s default full-text analysis (use your appropriate type).
- multiValued="true" allows multiple fields to be copied into _text_.
- stored="false" saves space if you only need it for searching.

Then add <copyField> entries to populate _text_. Add to your schema.xml:
<!-- Example: copy title, description, and content into _text_ -->
<copyField source="title" dest="_text_"/>
<copyField source="description" dest="_text_"/>
<copyField source="content" dest="_text_"/>
Example schema.xml snippet:

<schema name="example" version="1.6">
<field name="id" type="string" indexed="true" stored="true" required="true"/>
<field name="title" type="text_general" indexed="true" stored="true"/>
<field name="content" type="text_general" indexed="true" stored="true"/>
<field name="_text_" type="text_general" indexed="true" stored="false" multiValued="true"/>
<copyField source="title" dest="_text_"/>
<copyField source="content" dest="_text_"/>
</schema>
Test searching _text_:

Using _text_ as the default field:
http://localhost:8983/solr/mycore/select?q=solr&df=_text_

Or qualifying the field directly in the query:
http://localhost:8983/solr/mycore/select?q=_text_:solr

If you’d rather remove _text_: update every reference to _text_ in your query, config, and client code, or point the default field (df) to an existing field (e.g., title or content).

Cause | Solution |
---|---|
_text_ not defined | Add <field> for _text_ in schema.xml |
_text_ not populated | Add <copyField> for sources to _text_ |
Query refers to _text_ | Update query to use an existing field, or define _text_ |
Migrated core/config | Add _text_ back, or adjust queries/configs to not use it |
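As a quick client-side illustration, the two test-query styles can be built with Python's urllib — the host, port, and core name are illustrative, and _text_ must actually exist in the schema:

```python
# Sketch: the two equivalent ways to search the catch-all field.
from urllib.parse import urlencode

base = "http://localhost:8983/solr/mycore/select"

# Option 1: make _text_ the default field for the query terms
url_df = base + "?" + urlencode({"q": "solr", "df": "_text_"})

# Option 2: qualify the field directly in the query string
url_qualified = base + "?" + urlencode({"q": "_text_:solr"})
```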
Quick checklist:
- <field name="_text_".../> defined in schema.xml
- <copyField ... dest="_text_"/> entries in place
- No requests with q=...&df=_text_ or similar (unless the field exists)

The “Undefined field _text_” error means you’re referencing a field that isn’t defined or populated. Restore _text_ with <field> and <copyField>, or update your queries/configs to not depend on it.
May your schemas be valid and your queries swift!
When your Solr-powered app says “the file’s too big” or “too many boolean clauses!”—don’t panic. Here’s how to tweak your solrconfig.xml
so you can upload more, search harder, and (almost) never hit a wall.
The key setting: formdataUploadLimitInKB in your solrconfig.xml file.
Open your Opensolr Index Control Panel:
https://opensolr.com/admin/solr_manager/tools/INDEX_NAME
Navigate to the Config Files Editor tab.
Select solrconfig.xml.

Look for the <requestDispatcher> directive.
Example:
<requestDispatcher handleSelect="false"
multipartUploadLimitInKB="2048000"
formdataUploadLimitInKB="2048" />
- multipartUploadLimitInKB – for files uploaded via multipart POST.
- formdataUploadLimitInKB – for files uploaded as form data.
Increase the value as needed.
Warning: Don’t make it TOO big—your JVM heap and security folks will not thank you.
No matter how big you set those upload limits, if your query contains too many boolean clauses (lots of OR or AND), Solr will throw a “too many boolean clauses” error.
Solution: Increase maxBooleanClauses in your solrconfig.xml:

<maxBooleanClauses>2048</maxBooleanClauses>
If you’re hitting the limit even after increasing it, consider splitting your mega-query into several smaller queries and combining results in your app (think of it as Solr-powered pagination, but for logic!).
https://opensolr-server.solrcluster.com/solr/production/select?q=*:*&fq=field_name:(VAL1 OR ... OR VALn)
https://opensolr-server.solrcluster.com/solr/production/select?q=*:*&fq=field_name:(VALn+1 OR ... OR VALm)
Where n and m are each less than your maxBooleanClauses setting.
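Splitting the big OR-list into batches can be sketched in Python — the field name, values, and the 1024 limit below are illustrative placeholders for your own data:

```python
# Sketch: break a long OR-list into several fq strings, each staying
# under maxBooleanClauses, so no single query trips the limit.
def chunked_filter_queries(field, values, max_clauses=1024):
    """Yield fq strings like 'field:(v1 OR v2 ...)' with at most max_clauses values each."""
    for i in range(0, len(values), max_clauses):
        chunk = values[i:i + max_clauses]
        yield f"{field}:({' OR '.join(chunk)})"


# 2500 values -> 3 filter queries of 1024 + 1024 + 452 values
fqs = list(chunked_filter_queries("field_name", [f"VAL{i}" for i in range(2500)]))
```

Run each fq as a separate request and merge the results in your application.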
If you’re running Opensolr with a Resilient Cluster Solution, you get:
- Load balancing for multiple simultaneous queries
- Better resiliency
- Much faster response times!
Pro Tip:
Remember to always back up your config before making changes. Solr has a sense of humor, but only if you do, too. 😄
Sometimes, in the shared Opensolr cloud, the data folder may get corrupted, so it can't be read from or written to.
One easy fix is to simply remove your index and create another one, preferably under another name.
If that doesn't work, please contact us, and we'll be happy to fix it up for you.
Also, keep in mind that there may be other causes, so please make sure to check your error log by clicking the Error Log button inside your Opensolr index control panel, and keep refreshing that page to make sure the errors you see are current.
If you do see errors in there, please email them to us at [email protected] and we'll fix it for you.
If you get an error such as Unknown field... or Missing field, and your schema.xml already contains those fields, make sure you disable Schemaless Mode in solrconfig.xml.
Just head on to the Config Files Editor in your opensolr index control panel, and locate a snippet that looks like this:
class="ManagedIndexSchemaFactory"
According to the Solr documentation, you can disable the ManagedIndexSchemaFactory (and its dynamic schema REST APIs) by using the following instead:
<schemaFactory class="ClassicIndexSchemaFactory"/>
Also do not forget to remove the entire snippet regarding the ManagedIndexSchemaFactory, so that you won't accidentally use both.
Yes, you heard right: Opensolr now supports the JTS Topology Suite! And we do it with the kind of grace you’d expect from a platform that respects tradition while embracing progress.
The JTS Topology Suite is like the Swiss Army Knife for geometric operations in Java. It’s powerful, precise, and (let’s be honest) something you never knew you needed until you did. While it doesn’t come bundled with vanilla Solr, we’ve gone ahead and taken care of that for you.
In most Opensolr servers and data centers, JTS support is switched on by default. If you find that your index seems blissfully unaware of this geometric upgrade, don’t panic—simply contact our support team. We’ll flip the switch faster than you can say “topology.”
Nope! Once enabled, no extra setup is required on your part. That means less tinkering and more time for what really matters—like reading classic Solr documentation for fun.
“Why reinvent the wheel when you can roll with Opensolr and JTS?”
Happy searching!