I have a MySQL table:
CREATE TABLE documents (
id INT NOT NULL AUTO_INCREMENT,
I'm looking into chunking my data source for optimal data import into Solr, and was wondering whether it is possible to use a master URL that chunks the data into sections.
For example ...
I am getting an error when I use array_to_string(array()) for a multivalued field.
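As an aside, `array_to_string(array(...))` is PostgreSQL syntax; on MySQL the usual way to build a single delimited string for a multivalued field is `GROUP_CONCAT`, which DIH can split back apart with a `RegexTransformer` (`splitBy`). A minimal sketch, with made-up table and column names:

```sql
-- Hypothetical schema: one document row joined to many tag rows.
-- GROUP_CONCAT collapses the tags into one comma-delimited string,
-- which a DIH RegexTransformer (splitBy=",") can turn back into
-- a multivalued Solr field.
SELECT d.id,
       d.title,
       GROUP_CONCAT(t.tag_name SEPARATOR ',') AS tags
FROM documents d
LEFT JOIN tags t ON t.document_id = d.id
GROUP BY d.id, d.title;
```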
My entity looks like this:
<entity dataSource="ds-1" name="SolrTable" pk="Item_ID"
I have a MySQL database for my application. I implemented Solr search and used the DataImportHandler (DIH) to index data from the database into Solr. My question is: is there any way that if the database ...
When I run a full-import, it only indexes 1 document. In the logs I see it processing most of the records (~300 records). I don't see any errors ...
When I run the "Full import with cleaning" command, the error is "Indexing failed. Rolled back all changes".
My data-import config file:
<dataSource type="JdbcDataSource" name="ds-1" driver="com.mysql.jdbc.Driver" url="jdbc:mysql://my.ip/my_db" user="my_db_user" password="my_password" readOnly="True"/>
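For context, a `<dataSource>` element like the one above normally sits inside a data-config.xml alongside a `<document>`/`<entity>` block that maps SQL columns to Solr fields. A minimal sketch of the surrounding file (the entity name, query, and field names here are assumptions, not taken from the question):

```xml
<dataConfig>
  <dataSource type="JdbcDataSource" name="ds-1"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://my.ip/my_db"
              user="my_db_user" password="my_password"
              readOnly="True"/>
  <document>
    <!-- Hypothetical entity: the table and column names are placeholders. -->
    <entity dataSource="ds-1" name="documents" pk="id"
            query="SELECT id, title FROM documents">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
    </entity>
  </document>
</dataConfig>
```

Each `<field>` maps a column from the query's result set to a field declared in the Solr schema; columns whose names already match a schema field can be mapped implicitly.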