I am trying to index some XML files into Solr 6.2.1 using its DataImportHandler.
For that purpose I have added the needed <lib> imports and this requestHandler to solrconfig.xml:
<lib dir="${solr.install.dir:../../../..}/contrib/dataimporthandler/lib/" regex=".*\.jar" />
<lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-dataimporthandler-.*\.jar" />
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler" startup="lazy">
<lst name="default">
<str name="config">data-config.xml</str>
</lst>
</requestHandler>
Then I wrote the data-config.xml and put it in the same directory as solrconfig.xml:
<dataConfig>
<dataSource type="FileDataSource" encoding="UTF-8"/>
<document>
<entity name="pickupdir"
processor="FileListEntityProcessor"
dataSource="null"
baseDir="/vagrant/TREC8all/Adhoc/"
recursive="true"
fileName="^[\w\d-]+\.xml$" />
<entity name="trec8_simple"
processor="XPathEntityProcessor"
stream="true"
datasource="pickupdir"
url="${pickupdir.fileAbsolutePath}"
forEach="/DOCS/DOC">
<field column="id" xpath="/DOCS/DOC/DOCNO"/>
<field column="header" xpath="/DOCS/DOC/HEADER"/>
<field column="text" xpath="/DOCS/DOC/TEXT"/>
</entity>
</document>
</dataConfig>
This should make the DataImportHandler iterate recursively through all XML files in the directory and index them according to the XPath expressions.
When I call the requestHandler like this (I am running Solr in a Vagrant box rather than locally):
http://192.168.155.156:8983/solr/trec8/dataimport?command=full-import&entity=trec8_simple
I get this exception in solr.log:
ERROR (Thread-14) [ x:trec8] o.a.s.h.d.DataImporter Full Import failed:java.lang.NullPointerException
at org.apache.solr.handler.dataimport.DataImporter.createPropertyWriter(DataImporter.java:325)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:412)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:475)
at org.apache.solr.handler.dataimport.DataImporter.lambda$runAsync$0(DataImporter.java:458)
at java.lang.Thread.run(Thread.java:745)
I'm assuming this is the source for the DataImportHandler:
https://github.com/sudarshang/lucene-solr/blob/master/solr/contrib/dataimporthandler/src/java/org/apache/solr/handler/dataimport/DataImporter.java
I have trouble figuring out what is causing this exception and what it means. It would be nice if somebody could help me out. Thanks!
EDIT:
I think this has something to do with the DataImportHandler not being able to find the data-config.xml. When I remove it, Solr throws the exact same exception.
OK, I found the issue!
The problem was in the solrconfig.xml:
<lst name="default">
<str name="config">data-config.xml</str>
</lst>
should have been
<lst name="defaults">
<str name="config">data-config.xml</str>
</lst>
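Putting the two snippets together, the working handler definition in my solrconfig.xml now looks like this:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler" startup="lazy">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>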
Related
I am configuring my Solr server; I am able to start and stop the server and can see the dashboard etc.
I just created a core called "Wish". So in the wish folder of the server I added the data source details to solrconfig.xml.
Here is the essential part of it:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
<lst name="defaults">
<str name="config">data-config.xml</str>
<lst name="datasource">
<str name="driver">com.mysql.jdbc.Driver</str>
<str name="url">jdbc:mysql://localhost:3306/wish</str> //db name wish
<str name="user">root</str>
<str name="password"></str>
</lst>
</lst>
</requestHandler>
And the data-config.xml file content is:
<dataConfig>
<dataSource type="JdbcDataSource"
driver="com.mysql.jdbc.Driver"
url="jdbc:mysql://localhost:3306/wish"
user="root"
password=""/>
<document name="retailer">
<entity name="retailer" query="select * from retailer"></entity>
</document>
</dataConfig>
As you can see, I am just trying to add an entity retailer, and my retailer table contains 2 rows so far. But when I invoke the Solr API request below, nothing shows up :(
Here is the API request:
http://localhost:8983/solr/wish/select?q=*:*
but the result is always:
<response>
<lst name="responseHeader">
<int name="status">0</int>
<int name="QTime">0</int>
<lst name="params">
<str name="q">*:*</str>
</lst>
</lst>
<result name="response" numFound="0" start="0"/>
</response>
Is something wrong with my config, or is the way I am invoking it wrong?
Any clues and help will be much appreciated. Thanks in advance.
I've created a Solr core with its configuration, and when I try to launch the embedded Solr server, I get the error below.
Caused by: java.io.IOException: Can't find resource 'solrconfig.xml' in
classpath or '/home/tharindu/Desktop/solr_tmp/custom/newsportal/collection1/conf'
at org.apache.solr.core.SolrResourceLoader.openResource(SolrResourceLoader.java:362)
at org.apache.solr.core.SolrResourceLoader.openConfig(SolrResourceLoader.java:308)
at org.apache.solr.core.Config.<init>(Config.java:117)
at org.apache.solr.core.Config.<init>(Config.java:87)
at org.apache.solr.core.SolrConfig.<init>(SolrConfig.java:167)
at org.apache.solr.core.SolrConfig.readFromResourceLoader(SolrConfig.java:145)
... 9 more
It seems that it is trying to find a solr core named collection1 by default.
The custom folder contains:
-- solr.xml
-- newsportal
   -- conf
      -- schema.xml
      -- solrconfig.xml
   -- core.properties
I'm using Spring Data Solr's SolrTemplate. The embedded server configuration is below.
@Bean
public EmbeddedSolrServerFactoryBean solrServerFactoryBean() {
    EmbeddedSolrServerFactoryBean factory = new EmbeddedSolrServerFactoryBean();
    factory.setSolrHome("/home/tharindu/Desktop/solr_tmp/custom/newsportal");
    return factory;
}

@Bean
public SolrTemplate solrTemplate() throws Exception {
    return new SolrTemplate(solrServerFactoryBean().getObject(), "newsportal");
}
When I change the EmbeddedSolrServer bean as follows (only changing the solr home path),
@Bean
public EmbeddedSolrServerFactoryBean solrServerFactoryBean() {
    EmbeddedSolrServerFactoryBean factory = new EmbeddedSolrServerFactoryBean();
    factory.setSolrHome("/home/tharindu/Desktop/solr_tmp/custom");
    return factory;
}
I get the below error.
Caused by: org.apache.solr.common.SolrException: No such core:
at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:112)
at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:91)
at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)
at org.springframework.data.solr.core.SolrTemplate$11.doInSolr(SolrTemplate.java:417)
at org.springframework.data.solr.core.SolrTemplate$11.doInSolr(SolrTemplate.java:414)
at org.springframework.data.solr.core.SolrTemplate.execute(SolrTemplate.java:141)
... 59 more
But when I rename the core folder to collection1 and change the core name in core.properties to name=collection1, everything works fine.
Below are my schema.xml and solrconfig.xml.
<?xml version="1.0" encoding="UTF-8" ?>
<schema name="newsportal" version="1.5">
<types>
<fieldType name="string" class="solr.StrField" sortMissingLast="true" />
<fieldType name="text_general" class="solr.TextField" omitNorms="true">
<analyzer>
<tokenizer class="solr.WhitespaceTokenizerFactory"/>
<filter class="solr.LowerCaseFilterFactory" />
<filter class="solr.StopFilterFactory" words="stopwords_en.txt" />
</analyzer>
</fieldType>
</types>
<fields>
<field name="id" type="string" indexed="true" stored="true" required="true"/>
<field name="title" type="text_general" indexed="true" stored="true" required="true" termVectors="true"/>
<field name="description" type="text_general" indexed="true" stored="true" required="true" termVectors="true"/>
<field name="keywords" type="text_general" indexed="true" stored="true" multiValued="true" />
<defaultSearchField>keywords</defaultSearchField>
<copyField source="title" dest="keywords"/>
<copyField source="description" dest="keywords"/>
</fields>
<uniqueKey>id</uniqueKey>
</schema>
solrconfig.xml
<?xml version="1.0" encoding="UTF-8" ?>
<config>
<luceneMatchVersion>LUCENE_48</luceneMatchVersion>
<dataDir>${solr.data.dir:}</dataDir>
<directoryFactory name="DirectoryFactory" class="${solr.directoryFactory:solr.NRTCachingDirectoryFactory}" />
<codecFactory class="solr.SchemaCodecFactory" />
<schemaFactory class="ClassicIndexSchemaFactory" />
<indexConfig>
<lockType>${solr.lock.type:native}</lockType>
</indexConfig>
<updateHandler class="solr.DirectUpdateHandler2"/>
<query>
<maxBooleanClauses>1024</maxBooleanClauses>
<filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="0" />
<queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0" />
<documentCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0" />
<enableLazyFieldLoading>true</enableLazyFieldLoading>
<queryResultWindowSize>20</queryResultWindowSize>
<queryResultMaxDocsCached>200</queryResultMaxDocsCached>
<useColdSearcher>false</useColdSearcher>
<maxWarmingSearchers>2</maxWarmingSearchers>
</query>
<requestDispatcher handleSelect="false">
<requestParsers enableRemoteStreaming="true" multipartUploadLimitInKB="2048000" formdataUploadLimitInKB="2048" />
<httpCaching never304="true" />
</requestDispatcher>
<requestHandler name="/select" class="solr.SearchHandler" default="true">
<lst name="defaults">
<str name="sort">title asc</str>
<str name="echoParams">explicit</str>
<int name="rows">10</int>
<str name="q">*:*</str>
<bool name="facet">false</bool>
</lst>
</requestHandler>
<requestHandler name="/update" class="solr.UpdateRequestHandler"/>
<requestHandler name="/analysis/field" startup="lazy" class="solr.FieldAnalysisRequestHandler" />
<requestHandler name="/analysis/document" class="solr.DocumentAnalysisRequestHandler" startup="lazy" />
<requestHandler name="/admin/" class="solr.admin.AdminHandlers" />
<requestHandler name="/admin/ping" class="solr.PingRequestHandler">
<lst name="invariants">
<str name="q">*:*</str>
</lst>
<lst name="defaults">
<str name="echoParams">all</str>
</lst>
</requestHandler>
<admin>
<defaultQuery>*:*</defaultQuery>
</admin>
</config>
core.properties file
name=newsportal
EDIT
solr.xml file
<solr>
<solrcloud>
<str name="host">${host:}</str>
<int name="hostPort">${jetty.port:8983}</int>
<str name="hostContext">${hostContext:solr}</str>
<int name="zkClientTimeout">${zkClientTimeout:30000}</int>
<bool name="genericCoreNodeNames">${genericCoreNodeNames:true}</bool>
</solrcloud>
<shardHandlerFactory name="shardHandlerFactory"
class="HttpShardHandlerFactory">
<int name="socketTimeout">${socketTimeout:0}</int>
<int name="connTimeout">${connTimeout:0}</int>
</shardHandlerFactory>
</solr>
Solr version: 4.10.4
Spring Data Solr version: 1.5.5.BUILD-SNAPSHOT
I appreciate any help to resolve this issue.
Unfortunately, there is no clean way to do that, because collection1 is hardcoded in the EmbeddedSolrServerFactory class (see the corresponding Jira ticket).
But I tried this hacky workaround and it works for me:
@Bean
public EmbeddedSolrServerFactoryBean solrClient() {
    EmbeddedSolrServerFactoryBean embeddedSolrServerFactoryBean = new EmbeddedSolrServerFactoryBean() {
        @Override
        public EmbeddedSolrServer getObject() {
            return (EmbeddedSolrServer) getSolrClient("<YOUR_CORE_NAME>");
        }
    };
    embeddedSolrServerFactoryBean.setSolrHome("<YOUR_SOLR_HOME>");
    return embeddedSolrServerFactoryBean;
}
Have a look at this URL: https://wiki.apache.org/solr/Solr.xml%20(supported%20through%204.x)
To enable support for dynamic SolrCore administration, place a file
named solr.xml in the solr.home directory. Here is an example solr.xml
file:
<solr persistent="true" sharedLib="lib">
<cores adminPath="/admin/cores">
<core name="core0" instanceDir="core0" />
<core name="core1" instanceDir="core1" />
</cores>
</solr>
Thus, change the solr.xml file with an appropriate entry, for example:
<core name="core0" instanceDir="core0" />
Hope this helps
I have an index named LocationIndex in Solr, with the following fields:
<fields>
<field name="solr_id" type="string" stored="true" required="true" indexed="true"/>
<field name="solr_ver" type="string" stored="true" required="true" indexed="true" default="0000"/>
<!-- and some more fields -->
</fields>
<uniqueKey>solr_id</uniqueKey>
But now I want to change the schema so that the unique key is a composite of the two existing fields solr_id and solr_ver... something as follows:
<fields>
<field name="solr_id" type="string" stored="true" required="true" indexed="true"/>
<field name="solr_ver" type="string" stored="true" required="true" indexed="true" default="0000"/>
<field name="composite-id" type="string" stored="true" required="true" indexed="true"/>
<!-- and some more fields -->
</fields>
<uniqueKey>solr_ver-solr_id</uniqueKey>
After searching, I found that it's possible by adding the following to the schema (ref: Solr Composite Unique key from existing fields in schema):
<updateRequestProcessorChain name="composite-id">
<processor class="solr.CloneFieldUpdateProcessorFactory">
<str name="source">docid_s</str>
<str name="source">userid_s</str>
<str name="dest">id</str>
</processor>
<processor class="solr.ConcatFieldUpdateProcessorFactory">
<str name="fieldName">id</str>
<str name="delimiter">--</str>
</processor>
<processor class="solr.LogUpdateProcessorFactory" />
<processor class="solr.RunUpdateProcessorFactory" />
</updateRequestProcessorChain>
So I changed the schema, and it finally looks like this:
<updateRequestProcessorChain name="composite-id">
<processor class="solr.CloneFieldUpdateProcessorFactory">
<str name="source">solr_ver</str>
<str name="source">solr_id</str>
<str name="dest">id</str>
</processor>
<processor class="solr.ConcatFieldUpdateProcessorFactory">
<str name="fieldName">id</str>
<str name="delimiter">-</str>
</processor>
<processor class="solr.LogUpdateProcessorFactory" />
<processor class="solr.RunUpdateProcessorFactory" />
</updateRequestProcessorChain>
<fields>
<field name="solr_id" type="string" stored="true" required="true" indexed="true"/>
<field name="solr_ver" type="string" stored="true" required="true" indexed="true" default="0000"/>
<field name="id" type="string" stored="true" required="true" indexed="true"/>
<!-- and some more fields -->
</fields>
<uniqueKey>id</uniqueKey>
But while adding a document, I get this error:
org.apache.solr.client.solrj.SolrServerException: Server at http://localhost:8983/solr/LocationIndex returned non ok status:400, message:Document [null] missing required field: id
I don't understand what changes to the schema are required for this to work as desired.
A document I add contains the fields solr_ver and solr_id. How and where will Solr create the id field by combining these two fields into something like solr_ver-solr_id?
EDIT:
This link explains how to refer to this chain, but I'm unable to understand how it would be used in the schema. Where should I make the changes?
So it looks like you have your updateRequestProcessorChain defined appropriately and it should work. However, you need to add this to the solrconfig.xml file and not to schema.xml. The additional link you provided shows how to modify your solrconfig.xml file and add your defined updateRequestProcessorChain to the current /update request handler for your Solr instance.
So do the following:
Move your <updateRequestProcessorChain> to your solrconfig.xml file.
Update the <requestHandler name="/update" class="solr.UpdateRequestHandler"> entry in your solrconfig.xml file and modify it so it looks like the following:
<requestHandler name="/update" class="solr.UpdateRequestHandler">
<lst name="defaults">
<str name="update.chain">composite-id</str>
</lst>
</requestHandler>
This should then execute your defined update chain and populate the id field when new documents are added to the index.
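To sketch how that plays out (the field values here are made up, not taken from your post): if you send an XML update message that contains only solr_ver and solr_id through the /update handler configured above, the chain should clone both values into id and concatenate them with the configured delimiter, for example:
<add>
  <doc>
    <field name="solr_ver">0001</field>
    <field name="solr_id">ABC123</field>
    <!-- no id supplied here; the composite-id chain should populate it, e.g. id = "0001-ABC123" -->
  </doc>
</add>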
The solution described above may have a limitation: what if "dest" exceeds the maximum length because the concatenated fields are too long?
There is also another solution using MD5Signature (a class capable of generating a signature String from the concatenation of a group of specified document fields; a 128-bit hash used for exact duplicate detection):
<!-- An example dedup update processor that creates the "id" field on the fly
based on the hash code of some other fields. This example has
overwriteDupes set to false since we are using the id field as the
signatureField and Solr will maintain uniqueness based on that anyway. -->
<updateRequestProcessorChain name="dedupe">
<processor class="org.apache.solr.update.processor.SignatureUpdateProcessorFactory">
<bool name="enabled">true</bool>
<bool name="overwriteDupes">false</bool>
<str name="signatureField">id</str>
<str name="fields">name,features,cat</str>
<str name="signatureClass">org.apache.solr.update.processor.Lookup3Signature</str>
</processor>
<processor class="solr.LogUpdateProcessorFactory" />
<processor class="solr.RunUpdateProcessorFactory" />
</updateRequestProcessorChain>
From here: http://lucene.472066.n3.nabble.com/Solr-duplicates-detection-td506230.html
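As with the composite-id chain above, this dedupe chain only runs if it is referenced from the update handler, presumably something along these lines:
<requestHandler name="/update" class="solr.UpdateRequestHandler">
  <lst name="defaults">
    <str name="update.chain">dedupe</str>
  </lst>
</requestHandler>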
I'd like to add this as a comment, but it's impossible to get the creds these days... anyway, here is a better link:
https://wiki.apache.org/solr/Deduplication
OS: Ubuntu 12.04
I just upgraded my system from Solr 1.4 to Solr 4.3.0 and can't seem to get the MySQL driver to work (or so I suspect). Solr seems to work fine (accessing it through the browser, etc.) until I add the following lines to solrconfig.xml:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
<lst name="defaults">
<str name="config">/opt/solr/core01/conf/data-config.xml</str>
</lst>
</requestHandler>
After adding those lines to the config, restarting Tomcat 7, and refreshing localhost:8080/solr, I get the following error:
HTTP Status 500 - {msg=SolrCore 'core01' is not available due to init failure: org/apache/solr/util/plugin/SolrCoreAware,trace=org.apache.solr.common.SolrException:
SolrCore 'core01' is not available due to init failure: org/apache/solr/util/plugin/SolrCoreAware at
And then a bunch of miscellaneous filters, etc. I only see this error via the browser - I can't seem to find it in any of the Tomcat logs.
Another thing to note is that I have included the required JAR files:
solr-dataimporthandler-extras-4.3.0.jar
solr-dataimporthandler-4.3.0.jar
as they do not seem to come with the Solr 4.3.0 download.
This problem has wasted a lot of time, so hopefully someone here can lend a hand and see what is wrong.
Thanks in advance!
EDIT:
This is what my data-config.xml looks like
<?xml version="1.0" encoding="UTF-8"?>
<dataConfig>
<dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost/db_name" user="db_user" password="db_password" />
<document>
<entity name="table" query="SELECT * FROM table">
<field column="id" name="id" />
<field column="name" name="name" />
</entity>
</document>
</dataConfig>
With the obvious variables replaced with real data.
After looking at the DIHQuickStart page and at the error message, you might want to remove the path from your config setting, as Solr will by default look first in the conf folder of the core where you have defined the DIH. So change it to:
<str name="config">data-config.xml</str>
I am trying to import MongoDB data into Solr for indexing, using the SolrMongoDataImportHandler.
These are the lines I added to my solrconfig.xml:
<lib path="../../dist/solr-mongo-importer-1.0.0.jar" />
<lib path="../../dist/mongo-2.9.3.jar" />
<lib path="../../dist/apache-solr-dataimporthandler-3.6.1" />
and this is the request handler:
<requestHandler name="/dataimport" class="org.apache.solr.handler.dataimport.DataImportHandler">
<lst name="defaults">
<str name="config">data-config.xml</str>
</lst>
</requestHandler>
and this is my data-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<dataConfig>
<dataSource name="mymongo" type="MongoDataSource" database="cashlets" />
<document name="service">
<entity processor="org.apache.solr.handler.dataimport.MongoEntityProcessor"
query="{'Active':1}"
collection="Service"
datasource="mymongo"
transformer="org.apache.solr.handler.dataimport.MongoMapperTransformer" >
<field column="name" name="name" mongoField="Name"/>
</entity>
</document>
</dataConfig>
Now, when I try to do a full-import using http://localhost:8983/solr/dataimport?command=full-import
I get the following response:
<response>
<lst name="responseHeader">
<int name="status">0</int>
<int name="QTime">2</int>
</lst>
<lst name="initArgs">
<lst name="defaults">
<str name="config">data-config.xml</str>
</lst>
</lst>
<str name="command">full-import</str>
<str name="status">idle</str>
<str name="importResponse"/>
<lst name="statusMessages">
<str name="Time Elapsed">0:0:45.486</str>
<str name="Total Requests made to DataSource">0</str>
<str name="Total Rows Fetched">0</str>
<str name="Total Documents Processed">0</str>
<str name="Total Documents Skipped">0</str>
<str name="Full Dump Started">2012-11-26 16:21:30</str>
<str name="">Indexing failed. Rolled back all changes.</str>
<str name="Rolledback">2012-11-26 16:21:30</str>
</lst>
<str name="WARNING">
This response format is experimental. It is likely to change in the future.
</str>
</response>
and the following error in the logs:
SEVERE: Full Import failed:java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassCastException:
org.apache.solr.handler.dataimport.DebugLogger$2 cannot be cast to org.apache.solr.handler.dataimport.MongoDataSource
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:264)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445)
at org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:205)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)
at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.lang.RuntimeException: java.lang.ClassCastException:
org.apache.solr.handler.dataimport.DebugLogger$2 cannot be cast to
org.apache.solr.handler.dataimport.MongoD
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:621)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225)
... 24 more
In the logs, it says that it is trying to cast org.apache.solr.handler.dataimport.DebugLogger$2 to org.apache.solr.handler.dataimport.MongoDataSource, but I cannot figure out why this is happening. All the configuration looks fine. Is there anything else I need to do?