FAQ & API Documentation

Select a category on the left to get your answers quickly.

In this clip we will talk about some of the most common Drupal configuration issues, based on the Solr configuration files provided by the Drupal Search API module.

Please be advised that your Opensolr index may fail to reload when using AnalyzingInfixSuggester.

You can click here to learn more.

The error is reproduced mainly for Drupal indexes, when using the following suggester:

<searchComponent name="suggester" class="solr.SuggestComponent"
    startup="lazy">
    <lst name="suggester">
      <str name="name">default_infix</str>
      <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
      <str name="indexPath">suggester_infix_data</str>
      <str name="dictionaryImpl">DocumentDictionaryFactory</str>
      <str name="field">autocomplete</str>
      <str name="weightField">weigh_dummy</str>
      <str name="suggestAnalyzerFieldType">text_autocomplete</str>
      <str name="buildOnStartup">true</str>
      <str name="buildOnCommit">true</str>
    </lst>
    <lst name="suggester">
      <str name="name">default_fuzzy</str>
      <str name="lookupImpl">FuzzyLookupFactory</str>
      <str name="storeDir">suggester_fuzzy_data</str>
      <str name="dictionaryImpl">DocumentDictionaryFactory</str>
      <str name="field">autocomplete</str>
      <str name="weightField">weigh_dummy</str>
      <str name="suggestAnalyzerFieldType">text_autocomplete</str>
      <str name="buildOnStartup">true</str>
      <str name="buildOnCommit">true</str>
    </lst>
</searchComponent>

 

It is often the case (as with Drupal) that your config set will contain files like schema_extra.xml or solrconfig_extra.xml.

In this case, your main schema.xml and solrconfig.xml will contain references to various fields and types that are defined in those extra files.

Therefore, you need to split your config files archive into multiple archives and upload them as follows:

- First, upload the extra files (zip up schema_extra.xml and the other *_extra.xml files and upload that zip first).

- Second, upload the main schema.xml file, along with all other resource files, such as stopwords.txt, synonyms.txt, etc.

- Third, upload a zip archive that contains solrconfig.xml alone.

Solr works with a set of multiple configuration files.
Each Solr configuration file has its own purpose.

Therefore, some publishers (CMS systems, etc.) choose to create their own structure for these Solr configuration files, as is the case with Drupal, WordPress (WPSOLR), and others.

When uploading your Solr configuration files using your Opensolr Index Control Panel, it is therefore important to upload your files in a specific order:

  1. Create and upload a .zip archive containing all your dependency config files, such as the .txt files, schema_extra.xml, solrconfig_extra.xml: pretty much everything except schema.xml and solrconfig.xml.
  2. Create and upload a .zip archive containing your schema.xml file, since it defines all fields and references the archive you uploaded before (the one containing schema_extra.xml and similar files).
  3. Create and upload a .zip archive containing your solrconfig.xml file, since this one references field definitions inside your schema.xml and other dependency files.

So, basically, you should create these three archives and upload them separately, in this exact order, and everything should work.
You can, of course, automate this by using the Automation REST API to upload your config files.
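As a minimal sketch of the three-archive ordering above, here is how the zips could be prepared in Python. The file names are examples, and the actual upload call to the Automation REST API is deliberately left out, since its exact endpoint is documented in your Opensolr control panel:

```python
import io
import zipfile

# The three-step upload order: extras first, then schema.xml and resources,
# then solrconfig.xml alone. File lists below are illustrative examples.
steps = [
    ("1_extras.zip",     ["schema_extra.xml", "solrconfig_extra.xml", "elevate.xml"]),
    ("2_schema.zip",     ["schema.xml", "stopwords.txt", "synonyms.txt"]),
    ("3_solrconfig.zip", ["solrconfig.xml"]),
]

archives = []
for zip_name, file_names in steps:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name in file_names:
            # In a real run, read each file from disk instead of a placeholder
            zf.writestr(name, "<placeholder/>")
    archives.append((zip_name, buf.getvalue()))
    # ...upload buf.getvalue() here, waiting for each upload to finish
    # before sending the next archive, so the order above is preserved

print([name for name, _ in archives])
# → ['1_extras.zip', '2_schema.zip', '3_solrconfig.zip']
```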

ICU library errors

While uploading or saving your config files via the Opensolr editor, you may come across an error that reads something like this:

Error loading class 'solr.ICUCollationField'

The problem is that this library has to be enabled on our end.
Therefore, if you encounter this error, please Contact Us to request that the ICU library be activated for your server.
If you are unsure what type of error you're getting, you can always check your Error Logs after uploading your config files, by clicking your Index Name -> Tools -> Error Logs.

Opensolr Error Logs

Some issues to be mindful of.

Drupal has changed the way it works: for the path, it now only requires a SLASH.
Basically, what we call the Path (/solr/index_name) in Opensolr should be / in your Drupal setup.
And what Drupal calls SolrCore should be the name of your Opensolr index.

Also, if you use Opensolr in SolrCloud mode, please note that the solr server path is /solrcloud instead of /solr. (i.e.: https://server.opensolr.com/solrcloud/index_name/select?...)

So, unless Drupal has hard-coded the /solr part of the connection URL, you should be able to use your Opensolr SolrCloud index by setting your path in Drupal to /solrcloud instead of / (slash).

Ultimately, we will help you upload your config files regardless of the Solr version you decide to use. As usual, we'll do this for free, instantly, during our office hours.

You might also want to watch the clip about all external integration issues.

Opensolr connection to Drupal

It is a common misconception that if your Drupal module, or any other Solr plugin, requires, say, Solr version 6.4.x, and Opensolr only provides Solr version 6.1.x, you can't use it with Opensolr.

With small modifications, we can make your Solr config files work on any of our Solr versions, without any impact on your Solr integration or overall functionality.

Of course, migrating from one major Solr version to another isn't that straightforward, but we can even do that.
However, I can't stress enough that migration between minor Solr versions is not only possible but even recommended, since we'll do it all for you if you want to save time and money; Opensolr also offers a FREE membership now.

It is also a common misconception that, because your Opensolr connection URL returns 404 NOT FOUND, it cannot be used and something must be broken.

In your Opensolr Control Panel, you will see your connection URL as something like the following:
https://useast612.solrcluster.com/solr/opensolr/
Now, that URL will always return an HTTP status of 404 NOT FOUND.

Ironically, that means everything is OK.
Your application will use that as a base connection URL and append Solr request handlers to it, as you can see in this example:
https://useast612.solrcluster.com/solr/opensolr/select?q=*:*&wt=json&indent=true&start=0&rows=50&fl=title,description&fq=+content_type_text:(html)

We have added the /select request handler, and that very same connection URL now responds with a full Solr JSON-format response.
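As a sketch, assuming Python on the client side, appending a request handler to the base URL looks like this (the server, index name, and parameters mirror the example above):

```python
from urllib.parse import urlencode

# Base connection URL from the Opensolr control panel (returns 404 on its own)
base = "https://useast612.solrcluster.com/solr/opensolr"

# Append the /select request handler plus query parameters
params = {"q": "*:*", "wt": "json", "start": 0, "rows": 50,
          "fl": "title,description"}
query_url = f"{base}/select?{urlencode(params)}"
print(query_url)
# To actually run the search:
#   import urllib.request, json
#   results = json.load(urllib.request.urlopen(query_url))
```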

It's an OpenSolr Miracle! 

If you get the error:

Undefined field _text_

Please make sure to open solrconfig.xml in your Opensolr Control Panel Admin UI and remove the reference to the _text_ field under the /update initParams, which typically looks like this:

<initParams path="/update/**">
    <lst name="defaults">
      <str name="df">_text_</str>
    </lst>
</initParams>
The setting you need is formdataUploadLimitInKB, which is found in solrconfig.xml.

If you take a look at your solrconfig.xml file, there is an area that sets this up.

Open up your Opensolr index:
https://opensolr.com/admin/solr_manager/tools/INDEX_NAME
Go to the Config Files Editor tab.
Select solrconfig.xml.
Scroll down until you see a <requestParsers .../> directive (typically inside <requestDispatcher>). There, you'll see settings like:

<requestParsers enableRemoteStreaming="false"
                multipartUploadLimitInKB="2048000"
                formdataUploadLimitInKB="2048"/>

So, formdataUploadLimitInKB is what you need to change in your own solrconfig.xml file in your indexes.
 
 
HOWEVER #1: If you need a very large number of boolean clauses (as Solr calls them), you won't be able to run the query by default, since Solr will return the error: too many boolean clauses.
You can fix that by increasing the maxBooleanClauses parameter in your solrconfig.xml
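For example, the limit can be raised inside the <query> section of solrconfig.xml; the value 4096 below is just an illustration (the Solr default is 1024):

```xml
<!-- In solrconfig.xml, inside the <query> section.
     4096 is an example value; the default is 1024. -->
<maxBooleanClauses>4096</maxBooleanClauses>
```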
 
 
HOWEVER #2: For better query performance, you could consider alternative techniques, such as splitting your one very large query into multiple queries and aggregating the results in your own app.
 
For example:
 
Query 1: 
https://opensolr-server.solrcluster.com/solr/production/select?q=*:*&fq=field_name:(VAL1 OR ... OR VALn)
- where n is less than the maxBooleanClauses value set in your solrconfig.xml
 
Query 2: 
https://opensolr-server.solrcluster.com/solr/production/select?q=*:*&fq=field_name:(VALn+1 OR ... OR VALm)
- where m - n is less than the maxBooleanClauses value set in your solrconfig.xml
.
.
.
Query i... (you get the point)
 
You then merge the results from all those queries.
You can figure out the number of queries you have to make by dividing the number of OR clauses you need by the maxBooleanClauses setting in your solrconfig.xml
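The chunk-and-merge approach above can be sketched in Python; the server URL, field name, and values below are placeholders, and 1024 matches Solr's default maxBooleanClauses:

```python
from math import ceil
from urllib.parse import urlencode

def build_chunked_queries(base_url, field, values, max_clauses):
    """Split one huge OR filter into several queries of at most
    max_clauses values each, to stay within maxBooleanClauses."""
    n_queries = ceil(len(values) / max_clauses)
    urls = []
    for i in range(n_queries):
        chunk = values[i * max_clauses:(i + 1) * max_clauses]
        fq = field + ":(" + " OR ".join(chunk) + ")"
        urls.append(base_url + "/select?" + urlencode({"q": "*:*", "fq": fq}))
    return urls

# 2500 values with a limit of 1024 -> 3 queries, merged client-side afterwards
urls = build_chunked_queries(
    "https://opensolr-server.solrcluster.com/solr/production",
    "field_name", [f"VAL{i}" for i in range(2500)], 1024)
print(len(urls))  # 3
```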
 
And, needless to say, if you have your data replicated on multiple servers behind a load balancer (AKA our Opensolr Resilient Cluster solution), your multiple queries will be load balanced, which in turn results in much faster queries and better resiliency.

YES, however, it's only active on some servers right now.
Please ask us to install that, or any other Solr plugin library, by following the guide here, and we'll be happy to set it up for you.

You send it to support@opensolr.com and one of our tech people will install it on the server you're using (you'll need to specify the index name so we can identify the server), within about 24 hours (it usually takes just a couple of hours if the plugin is fully compatible and all goes well).

Make sure you send the .jar file itself, and it would help if you also verify that it is compatible with the Solr version you're currently using in your Opensolr control panel.

Please don't send git repositories, as we won't build the .jar library on our end.

If you keep getting redirected to the Login page, or you are having trouble placing a new order after trying to log in multiple times, please try clearing the Opensolr cookies or using a different browser.

   

Sometimes, in the shared Opensolr cloud, the data folder may get corrupted, so it can't be read from or written to.
One easy fix is to simply remove your index and create another one, preferably under a different name.

If that doesn't work, please contact us, and we'll be happy to fix it up for you.

Also, keep in mind that there may be other causes, so please make sure to check your error log by clicking the Error Log button inside your Opensolr index control panel, and keep refreshing that page to make sure the errors you see are up to date.

If you do see errors in there, please email them to us at support@opensolr.com and we'll fix it for you.

 

Click on the Tools menu item on the right-hand side, and then simply use the form to create your query and delete data.

To move from using the managed-schema to schema.xml, simply follow the steps below:

In your solrconfig.xml file, look for the schemaFactory definition. If you have one, remove it and add this instead:

<schemaFactory class="ClassicIndexSchemaFactory"/>

If you don't have one, just add the above snippet somewhere above the requestHandler definitions.

 

To move from the classic schema.xml in your Opensolr index to the managed-schema, simply follow the steps below:

 

In your solrconfig.xml, look for a schemaFactory definition and replace it with this snippet:

<schemaFactory class="ManagedIndexSchemaFactory">
    <bool name="mutable">true</bool>
    <str name="managedSchemaResourceName">managed-schema</str>
</schemaFactory>

 

If you don't have any schemaFactory definition, just paste the above snippet into your solrconfig.xml file, just above any requestHandler definition.

Opensolr now provides the following solr versions:

- SOLR 3.6.5

- SOLR 4.0

- SOLR 4.10

- SOLR 5.1

- SOLR 6.1

- SOLR 7.0

- SOLR 8.0

However, any other Solr version may be requested by dedicated Opensolr Cloud users.

Solr Versions provided by Opensolr.com

 

If you get an error such as: Unknown field... or Missing field, and your schema.xml already contains those fields, make sure you disable schemaless mode in solrconfig.xml.

Just head to the Config Files Editor in your Opensolr index control panel and locate a snippet that looks like this:

class="ManagedIndexSchemaFactory"

According to the Solr documentation, you can disable the ManagedIndexSchemaFactory as per the instructions below:

To disable the dynamic schema REST APIs, use the following schemaFactory definition instead:

<schemaFactory class="ClassicIndexSchemaFactory"/>

Also, do not forget to remove the entire snippet regarding the ManagedIndexSchemaFactory, so that you won't accidentally use both.

Yes, Opensolr now supports the JTS Topology Suite by default, which does not come bundled with the default Solr distribution.
It should be enabled in most of our servers and datacenters; however, if it doesn't seem to work for your index, please Contact Support and we'll be happy to enable it for you.
No further setup will be required on your part.

Please go to https://opensolr.com/pricing and make sure you select the analytics option from the extra features tab when you upgrade your account.

If you can see analytics but no data, make sure your Solr queries are correctly formatted in the form:
https://server.opensolr.com/solr/index_name/select?q=your_query&other_params... 

So, the search query must be clearly visible in the q parameter in order for it to show in analytics. 

Here are a few ways to save your monthly allotted bandwidth:

  1. Use local caching, such as memcache; this can greatly reduce the actual requests made to our Solr servers, thus saving you bandwidth.
  2. Since the allocated bandwidth is PER INDEX, you could set up Solr replication and have your local application perform round-robin requests across all of your replicas. This way, bandwidth is saved by balancing it between the resources of multiple indexes. For example, if your account has 1 GB PER INDEX, you can create Index A, replicate it onto Index B, and make requests to both in a round-robin fashion, thus gaining 2 GB of total bandwidth for your index.
  3. Make sure your Solr queries return as little data as possible. For example, use the rows and fl parameters on Solr /select requests to only return the records and the fields you really need. Any other data that gets returned will be counted as extra bandwidth.

There are a couple of things you might be able to do to trade performance for index size. For example, an integer (int) field uses less space than a trie integer (tint), but range queries will be slower when using an int.

To make major reductions in your index, you will almost certainly need to look more closely at the fields you are using.

  • Are you using a lot of stored fields? If so, try removing the stored fields from the index and query your database for the necessary data once you've got the results back from Solr.
  • Add omitNorms="true" to text fields that don't need length normalization
  • Add omitPositions="true" to text fields that don't require phrase matching
  • Special fields, like NGrams, can take up a lot of space
  • Are you removing stop words from text fields?
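As an illustration of the omitNorms and omitPositions options above, a space-saving field definition in schema.xml could look like this (the field and type names are examples only, not taken from your schema):

```xml
<!-- Example only: a text field that skips length normalization and positions,
     giving up length-based scoring and phrase matching to save index space -->
<field name="body" type="text_general" indexed="true" stored="false"
       omitNorms="true" omitPositions="true"/>
```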