Welcome to the future of hassle-free content delivery and search performance.
Opensolr now proudly offers professional Drupal hosting alongside our battle-tested Solr Cloud platform, trusted for over a decade. Whether you're running a small blog or a massive enterprise site, we’ve got the backbone to keep it fast, secure, and always online.
🚀 Optimized Performance
Our infrastructure is finely tuned for Drupal and Solr workloads. You get lightning-fast page loads and blazing search response times.
🔒 Enterprise-Grade Reliability
With resilient Solr clusters and load-balanced Drupal stacks, your content stays live, even during peak traffic or hardware failures.
🔁 Automatic Scaling & Replication
We offer horizontal scaling, auto-replication, and load balancing out of the box — so your platform grows with you, not against you.
🛠️ Fully Managed DevOps
Forget about downtime, server maintenance, or updates. We take care of the tech so you can focus on content and community.
🌐 GitHub/GIT Integration & CI/CD
We’ll set up and manage your Git repository, making development smoother and safer with streamlined version control and deployment workflows.
🔄 DR & Staging Environments
Need dev replicas? No problem. We can spin up full DR (Disaster Recovery) and staging environments for both Drupal and Solr, so your team can test confidently before going live.
📞 White-Glove Support
Personalized onboarding, 24/7 monitoring, and seasoned engineers ready to step in — no ticket gets left behind.
We make it simple. Just contact us with a few details about your Drupal setup, and we’ll handle the rest.
Whether you're migrating, scaling, or starting from scratch — we’ll design, build, and manage your entire stack, from content to search, like clockwork.
Say goodbye to outages, downtime, and tech headaches.
Say hello to speed, scalability, and serenity.
🧠 OpenSolr — Powering smarter Drupal hosting, one cluster at a time.
Drupal’s Search API and Facet API often generate large, complex Solr queries like this:
?q={!boost b=boost_document} *:* &
fq=(+available:true +display_in_catalog:Yes +(*:* -facet_field:X)) &
facet.field=field_languages &
facet.field=field_formats &
facet.field=field_categories &
facet.limit=-1 &
facet.mincount=1 &
rows=20
From Solr’s debug section:
processor: SimpleFacets, appliedMethod: FCS (FieldCache Scanning), facet.limit=-1, facet.missing=false
So the query itself is fine; it's the faceting that eats up most of the runtime.
The *:* query pattern
Drupal often sends {!boost b=boost_document} *:*, meaning “match everything.”
Even with filters, Solr still evaluates the entire index before narrowing results.
That’s expensive, especially on big collections.
fq filters
Dozens of filters like:
fq=field_a:valueA
fq=field_b:valueB
fq=(+available:true +display_in_catalog:Yes)
Each one adds cost — especially negations like (*:* -field:value) or range queries like [ * TO * ].
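As a rough sketch, a negation like the one above can often be replaced by a positive flag that Solr can filter on directly (the has_facet_x field below is hypothetical and would need to be populated at index time):

Costly: Solr enumerates all documents, then subtracts the matches
fq=(*:* -facet_field:X)

Cheaper: filter on a positive flag instead
fq=has_facet_x:false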
Fields such as field_language, field_format, field_category, etc. are all requested, often with:
facet.limit=-1
facet.missing=false
facet.sort=count
Each of these facets has to count every term across every matching document. That’s the main bottleneck — the facet block alone took ~1700ms.
Solr’s FCS method scans the entire field value space. When many documents or multi-valued fields are involved, it’s slow. The newer enum or stream methods, or JSON facets, are more efficient.
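For example, if you stay on the classic facet parameters, you can ask Solr to use the enum method for low-cardinality fields, either globally or per field (standard Solr parameters; field names taken from the example above):

facet.method=enum
f.field_languages.facet.method=enum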
In Configuration → Search API → View Modes, disable facets that aren’t visible on the page.
Every active facet triggers Solr work, even if Drupal doesn’t render it.
In Facet API settings, set:
facet.limit=50
facet.mincount=1
Avoid facet.limit=-1 — that means “return everything.”
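A trimmed facet request for the fields in the earlier example would then look something like this:

facet=true&
facet.field=field_languages&
facet.field=field_formats&
facet.limit=50&
facet.mincount=1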
facet.missing
When set to true, this parameter forces Solr to also calculate a count for documents that have no value in the field. Unless your UX needs that bucket, leave it off or remove it.
If your Solr core is 7.x or newer, switch Drupal’s Search API backend to use the JSON Facet API.
It’s faster, parallelized, and supports streaming facet counts.
Enable Drupal’s Search API query cache and facet cache.
It prevents Solr from recomputing identical facets for repeated queries.
Ensure your Solr solrconfig.xml includes:
<filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
This helps repeated filters (like available:true) resolve instantly.
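Optionally, you can also warm that cache when a new searcher opens, so the first request after a commit doesn't pay the full cost. A minimal sketch for solrconfig.xml, reusing the available:true filter from above:

<listener event="newSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst>
      <str name="q">*:*</str>
      <str name="fq">available:true</str>
    </lst>
  </arr>
</listener>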
facet.threads
If you must run many facets:
facet.threads=4
Parallelizes field facet computation.
Ensure all facet fields are stored with docValues="true" in the schema:
<field name="field_languages" type="string" stored="true" indexed="true" docValues="true"/>
This speeds up both faceting and sorting.
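The same applies to multi-valued facet fields; a sketch, assuming field_categories holds several values per document (note that adding docValues to an existing field requires a full reindex):

<field name="field_categories" type="string" stored="true" indexed="true" multiValued="true" docValues="true"/>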
Switch from classic facet.field to:
json.facet={
  categories: { type: "terms", field: "field_category", limit: 50 },
  formats: { type: "terms", field: "field_format", limit: 50 }
}
These APIs skip legacy overhead.
From your debug trace:
"facet-debug": { "elapse": 1704, "processor": "SimpleFacets", "appliedMethod": "FCS", "inputDocSetSize": 10 }
Even with only 10 returned documents, the facet engine must still traverse the full dataset to count terms. This is why even small result sets can be slow when facets are many or unbounded.
| Area | Action | Typical Gain |
|---|---|---|
| Drupal facet limit | Set facet.limit=50 | 60–80% faster |
| Remove facet.missing | Disable it | 5–15% faster |
| Enable caching | Drupal + Solr | Big |
| Use DocValues | Schema config | Major |
| Use JSON facets | Modern Solr | 2–5× faster |
| Avoid *:* base queries | Use tighter filters | Major |
| facet.threads=4 | Solr config | 20–40% faster |
Before:
?q={!boost b=boost_document} *:*&facet.limit=-1&facet.field=...
After:
?q=*:*&fq=available:true&fq=display_in_catalog:Yes&
facet.limit=50&
facet.threads=4&
json.facet={
languages:{type:terms,field:field_language,limit:50},
formats:{type:terms,field:field_format,limit:50}
}
The slowdown isn’t Solr being “slow” — it’s Drupal asking for everything at once.
By trimming unnecessary facets, tightening query filters, and enabling JSON facets with DocValues, you can cut load time by 80–90% without losing functionality.
This tutorial applies to any Drupal + Apache Solr setup where complex facet-heavy queries are generated by the Search API or Facet API modules.
Want to hook up your Drupal site with blazing-fast, scalable search? You’re in the right place.
Whether you’re using Search API Opensolr (recommended for those who love “easy mode”) or Search API Solr, setup is a breeze—if you know what to watch for!
You can connect Drupal to Opensolr using either the Search API Opensolr module or the standard Search API Solr module.
Either way, setup should be pretty straightforward.
If you’re using Search API Opensolr, you get access to AutoConfiguration.
Prefer to set things up manually with Search API Solr? Maybe you like living on the edge. Here’s what you must do to keep your data safe and your setup smooth:
Choose the Correct Solr Version
Picking the wrong Solr version can turn your search into a tragic drama (data loss, broken indexes, existential dread).


Handle HTTP Authentication Like a Pro
All Opensolr Indexes are protected by Basic HTTP Auth.
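You can verify the credentials outside Drupal with a quick request, for example (placeholder credentials, hostname taken from the example further down this page):

curl -u your_username:your_password "https://chicago96.solrcluster.com/solr/opensolr/select?q=*:*&wt=json"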

Download the Right Solr Configuration Zip
Make sure the config archive you download from Drupal matches your Solr version on Opensolr.
Then, upload it to your Opensolr Index Control Panel.

Create Your Index & Start Indexing
Once that’s done, create your index in Drupal and let the indexing begin! (Search glory awaits.)
Now go forth and let your Drupal site search like a champion—with the least possible drama!





Drupal - Search API Opensolr: Opensolr is Drupal-ready with the Search API Opensolr module. It extends the main Search API Solr module and provides functionality for connecting to and managing Solr using the Opensolr services.
Some issues to be mindful of.
Drupal has changed the way it works: for the path setting, it now only requires a slash.
Basically, what we call the Path (/solr/index_name) in Opensolr should simply be / in your Drupal setup.
And what Drupal calls the Solr core should be the name of your Opensolr index.
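So, as a rough sketch (exact field labels vary slightly between Search API Solr versions), a typical Drupal Solr server configuration for an Opensolr index named index_name would be:

HTTP protocol: https
Solr host: server.opensolr.com
Solr port: 443
Solr path: /
Solr core: index_name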
Also, if you use Opensolr in SolrCloud mode, please note that the solr server path is /solrcloud instead of /solr. (i.e.: https://server.opensolr.com/solrcloud/index_name/select?...)
So, unless Drupal hard-codes the /solr part of the connection URL, you should be able to use your Opensolr SolrCloud index by setting your path in Drupal to /solrcloud instead of / (slash).
Ultimately, we will help you upload your config files regardless of the Solr version you decide to use. As usual, we'll do this for free, and instantly, during our office hours.
You might also want to watch the clip about all external integration issues.
It is a common misconception that if your Drupal module, or any other Solr plugin, requires, say, Solr version 6.4.x, and Opensolr only provides Solr version 6.1.x, you can't use it with Opensolr.
With small modifications, we can make your Solr config files work on any of our Solr versions, without any impact on your integration or overall functionality.
Of course, migrating from one major Solr version to another isn't that straightforward, but we can even do that.
However, I can't stress enough that migration between any Solr minor versions is not only possible, it's even recommended. We'll do it all for you if you want to save time and money, and, of course, Opensolr also offers FREE membership now.
Another common misconception: your Opensolr connection URL returns 404 NOT FOUND, and therefore it cannot be used and something must be broken.
In your Opensolr Control Panel, you will see your connection url as something like the following:
https://chicago96.solrcluster.com/solr/opensolr/
Now, that URL will always return an HTTP STATUS 404 NOT FOUND.
Ironically, that means everything is OK.
Your application will use that as a base connection URL and append other Solr request handlers to it, as you can see in this example:
https://chicago96.solrcluster.com/solr/opensolr/select?q=*:*&wt=json&indent=true&start=0&rows=50&fl=title,description&fq=+content_type:html
We have added the /select request handler, and that very same connection URL now responds with a full Solr JSON response.
It's an OpenSolr Miracle!
Here are a few things to check if your Drupal site won't connect to your Opensolr index or cluster.
1. If you are using Drupal 8+, make sure you are using the latest Search API Opensolr module from the Drupal community, as this will make things much easier for you.
2. In your Drupal Solr server settings, select either Solr with Basic Auth or Opensolr with Basic Auth.
3. DO NOT forget to set the Opensolr index auth credentials in your Drupal Solr server settings page, as described here.
4. Try to set a shorter password for your Opensolr Index by going to your Index Security Tab in your Opensolr Index Control Panel (See image below). Replace INDEX_NAME with your own index name. (https://opensolr.com/admin/solr_manager/tools/INDEX_NAME#security)
Then try to use the new credentials in your Drupal Settings Page.
5. Make sure you upload the Solr configuration files provided by Drupal to your Opensolr Index Control Panel, at: https://opensolr.com/admin/solr_manager/tools/INDEX_NAME#configuration (Replace INDEX_NAME with your own index name)
6. Make sure you use the correct connection parameters as indicated in the Opensolr Index Control Panel Dashboard: https://opensolr.com/admin/solr_manager/tools/INDEX_NAME#overview (Replace INDEX_NAME with your own index name)

In this clip, we will talk about some of the most common Drupal configuration issues, based on the Solr configuration files provided by the Drupal Search API module.
Error is:
SOLUTION:
(replace INDEX_NAME with your own Opensolr INDEX NAME)
That's all because you are using updateLog in your solrconfig.xml
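The block it refers to typically looks like the snippet below in solrconfig.xml. Depending on your setup, either removing or commenting it out (assuming you don't rely on transaction logs), or keeping it and making sure your schema defines the _version_ field it requires, should clear the error:

<updateLog>
  <str name="dir">${solr.ulog.dir:}</str>
</updateLog>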

Also, if you go here:
Drupal 6.x Known issue:
Looking at solrconfig_extra.xml you will find stuff like this:
<str name="accuracy">0.5</str>
<str name="maxEdits">2</str>
<str name="minPrefix">1</str>
<str name="maxInspections">5</str>
<str name="minQueryLength">4</str>
<str name="maxQueryFrequency">0.01</str>
<str name="thresholdTokenFrequency">.01</str>
You will have to change it to stuff like this:
<float name="accuracy">0.5</float>
<int name="maxEdits">2</int>
<int name="minPrefix">1</int>
<int name="maxInspections">5</int>
<int name="minQueryLength">4</int>
<float name="maxQueryFrequency">0.01</float>
<float name="thresholdTokenFrequency">.01</float>
I hope it's obvious why...
And please try to find more occurrences of integers and floats being defined as strings in the Drupal config files... I'm sure there must be more errors like these in other configs, for other Solr versions, as well.