In this post, we will learn how to integrate a Solr instance with a .NET Core application.
We will first introduce the Solr-based client-server architecture to understand how data should reside for Solr-based search support. Then we will set up a Solr instance to run in a Docker container.
Finally, we will write code to connect to this container from a .NET Core application.
Introduction To Solr
Solr is a popular and highly efficient search engine platform. It is written in Java and is open source, developed under the Apache Lucene project.
Solr is a standalone enterprise search server with a REST-like API. You put documents in it (called “indexing”) via JSON, XML, CSV or binary over HTTP. You query it via HTTP GET and receive JSON, XML, CSV or binary results.
https://lucene.apache.org/solr/features.html
Some Benefits Of Using Solr
- Full-text search
- High volume search support
- REST API based query support
- Advanced Admin interface

Solr Search Based Client-Server Architecture
In a typical client-server architecture, here’s how things usually work.

A web app or a web API receives HTTP requests, which the server translates into CRUD operations on the underlying database.
Now, with the introduction of a Solr search engine, things are going to look a bit different.

Data Duplication
The first thing we need to consider when integrating with Solr is that our data is going to be duplicated: it resides in the persistent database as well as in the Solr core.
Of course, we will not duplicate the entire database. Only the data that should be queryable by your application’s search feature needs to be stored in the Solr core.
Consequence Of Data Duplication
To make sure that the needed application data resides in the Solr core, the database and the Solr core must stay in sync at all times.
So, every add, update and delete operation that you perform on the database must be propagated to Solr as well.
Performing Search Operation With Solr
In the absence of a Solr search engine, we queried data directly on the underlying database. Now, with Solr, we perform all search queries against the Solr instance.
Depending on your data structure, you might still need to query the database based on the results of the Solr query. However, those subsequent database queries will only be basic select queries.
All the advanced and complex queries should be performed on the Solr search engine.
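For illustration, a hypothetical search request against Solr’s select handler could look like this (the core name and field names are the placeholders used later in this post):
http://hostname:8983/solr/my_core/select?q=title:bicycle&fq=isActive:true&rows=10
Here, q carries the main full-text query, fq applies a filter, and rows limits the number of returned documents.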
Run Solr Container Via Docker Compose
Create a service in your docker-compose file for the Solr instance and expose it on port 8983. A minimal compose file looks like this:
services:
  solr:
    image: solr
    ports:
      - "8983:8983"
    volumes:
      # named volume so the index data survives container restarts
      - solrIndex:/opt/solr/server/solr/my_core
    entrypoint:
      - solr-precreate
      - my_core

volumes:
  solrIndex:
Here, we have specified that we want to pre-create a Solr core called “my_core” whenever Docker creates the Solr container, and we have mounted a named volume, solrIndex, so that the index data survives container restarts. A core is somewhat similar to a database; it refers to a single index in Solr.
After you run the docker-compose up command, you should be able to access the Solr admin panel from your machine:
http://hostname:8983/solr/
If you are running the app on your local machine, the hostname is going to be localhost. On a Windows machine, you might have to use your docker-machine’s IP, which you can get via the command:
docker-machine ls
You can do all sorts of things from the admin panel, like querying data and analyzing requests. In this post, however, we are going to focus on connecting to our Solr instance from a .NET Core application.
Setup To Integrate Solr Instance With .NET Core Application
Solr Client Nuget Package For .NET Core
We will be making use of the SolrNet package, which is a Solr client for .NET applications. Add a reference to this library in your .NET Core application.
<PackageReference Include="SolrNet" Version="1.0.17" />
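If you prefer the command line, the same reference can be added via the dotnet CLI (the version shown is simply the one used in this post):
dotnet add package SolrNet --version 1.0.17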
Defining Solr Document Schema
In Solr, the basic unit of data is called a document, which is a set of data that describes something. You can think of a document as something like a table row in an RDBMS. Each document has many fields. For example, a Contact document can contain fields like name, number and address.
We will define the document schema via a data model class. Let’s say we have a database entity called Post, which looks like this:
public class Post
{
    public Post() { }

    public string Id { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public double Price { get; set; }
    public bool IsActive { get; set; }
    public DateTime PostedDate { get; set; }
    public DateTime ExpiryDate { get; set; }
    public bool IsSold { get; set; }
}
A Solr document data model for this same class would look like this:
public class SolrPostModel
{
    public SolrPostModel() { }

    public SolrPostModel(Post model)
    {
        this.Id = model.Id;
        this.Description = model.Description;
        this.IsActive = model.IsActive;
        this.Price = model.Price;
        this.Title = model.Title;
    }

    [SolrUniqueKey("id")]
    public string Id { get; set; }

    [SolrField("title")]
    public string Title { get; set; }

    [SolrField("description")]
    public string Description { get; set; }

    [SolrField("price")]
    public double Price { get; set; }

    [SolrField("isActive")]
    public bool IsActive { get; set; }
}
Basically, we create fields for only those items that are going to be queryable later on.
Perform Add, Update And Delete On Solr Index
We have a working Solr document model now. Next we need a way to perform operations like add, update and delete with this model on the Solr instance.
public interface ISolrIndexService<T>
{
    bool AddUpdate(T document);
    bool Delete(T document);
}
We perform such operations with the help of ISolrOperations, which exposes all possible Solr operations.
public class SolrIndexService<T, TSolrOperations> : ISolrIndexService<T>
    where TSolrOperations : ISolrOperations<T>
{
    private readonly TSolrOperations _solr;

    public SolrIndexService(ISolrOperations<T> solr)
    {
        _solr = (TSolrOperations)solr;
    }

    public bool AddUpdate(T document)
    {
        try
        {
            // If the id already exists, the record is updated; otherwise it is added
            _solr.Add(document);
            _solr.Commit();
            return true;
        }
        catch (SolrNetException ex)
        {
            // Log exception
            return false;
        }
    }

    public bool Delete(T document)
    {
        try
        {
            // Can also delete by id
            _solr.Delete(document);
            _solr.Commit();
            return true;
        }
        catch (SolrNetException ex)
        {
            // Log exception
            return false;
        }
    }
}
The SolrIndexService is a generic class which can be re-used for any Solr document model.
Now, every time you create a new object in your database, you should add a document in the Solr instance as well.
...
// in your database service class

// an instance of SolrIndexService, injected via the constructor
private readonly ISolrIndexService<SolrPostModel> solrIndexService;

// create post
public void Create(Post model)
{
    // insert into db first
    ....

    // now insert into Solr index
    solrIndexService.AddUpdate(new SolrPostModel(model));
}
You can repeat similar implementations for update and delete operations.
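For example, a delete on the database side could be mirrored like this (a minimal sketch; just like the Create example, the actual database call is omitted):
// delete post
public void Delete(Post model)
{
    // delete from db first
    ....

    // now remove the corresponding document from the Solr index
    solrIndexService.Delete(new SolrPostModel(model));
}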
Inject SolrIndexService From Startup In .NET Core
In the example above, we assumed that the SolrIndexService is going to be injected into the service layer. We can configure this from the Startup class in a .NET Core application.
...
// inside the ConfigureServices method
services.AddSolrNet<SolrPostModel>("http://localhost:8983/solr/my_core");
services.AddScoped<ISolrIndexService<SolrPostModel>, SolrIndexService<SolrPostModel, ISolrOperations<SolrPostModel>>>();
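With this registration in place, the index service can be constructor-injected wherever it is needed. As a minimal sketch (PostService is just a hypothetical name for your own database service class):
public class PostService
{
    private readonly ISolrIndexService<SolrPostModel> solrIndexService;

    // resolved by the .NET Core dependency injection container
    public PostService(ISolrIndexService<SolrPostModel> solrIndexService)
    {
        this.solrIndexService = solrIndexService;
    }

    // ... Create, Update and Delete methods as shown earlier
}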
Wrapping Up
This concludes the part about connecting our Solr instance with a .NET Core application. We created a Solr instance that runs in a Docker container. Then we created a data model to represent a Solr document. Finally, we created methods like add and delete to ensure that the data between the database and the Solr instance stays in sync.
In the next part, we will see how we can integrate Solr search into our application.
Next Part: Coming Soon!
Check out our other .NET Related Articles as well.