If I remember correctly, the ES connector can take an optional query parameter. You
should be able to construct an ES query that specifies the fields and match
conditions.
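
Something along these lines might work with JavaEsSpark (a sketch, assuming your
elasticsearch-hadoop version has the esRDD(jsc, resource, query, cfg) overload and
the es.read.field.include setting; the "status" field and its value are made-up
placeholders, and jsc / searchIndex are the ones from your snippet):

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.api.java.JavaPairRDD;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

// Query DSL string that restricts which documents match.
String query = "{\"query\": {\"term\": {\"status\": \"active\"}}}";

// Connector settings; es.read.field.include should limit the fields
// returned in each document's Map to just @timestamp.
Map<String, String> cfg = new HashMap<>();
cfg.put("es.read.field.include", "@timestamp");

JavaPairRDD<String, Map<String, Object>> esRDD =
        JavaEsSpark.esRDD(jsc, searchIndex, query, cfg);

If you only need to narrow which documents come back (rather than which fields),
the three-argument esRDD(jsc, searchIndex, query) form alone should be enough.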

On Thu, 21 Sep 2017 at 3:44 am, Jean Georges Perrin <j...@jgp.net> wrote:

> Same issue with RDBMS ingestion (I think). I solved it with views. Can you
> do views on ES?
>
> jg
>
>
> > On Sep 20, 2017, at 09:22, Kedarnath Dixit <kedarnath_di...@persistent.com> wrote:
> >
> > Hi,
> >
> > I want to fetch only selected fields from ES using the Spark ES connector.
> >
> > I have written some code which fetches all the documents matching the
> > given index, as below:
> >
> > JavaPairRDD<String, Map<String, Object>> esRDD =
> >         JavaEsSpark.esRDD(jsc, searchIndex);
> >
> > However, is there a way to get only specific fields from the documents
> > for every index in ES, rather than getting everything?
> >
> > Example: Let's say I have many fields in the documents, and @timestamp
> > is one of the fields in the response: { ..............,
> > @timestamp=Fri Jul 07 01:36:00 IST 2017, .............. }. How can I
> > get only the @timestamp field for all my indexes?
> >
> > I could see something here but am unable to correlate it. Can someone
> > help me, please?
> >
> >
> > Many Thanks!
> > ~KD
> >
> >  With Regards,
> >  ~Kedar Dixit
> >  kedarnath_di...@persistent.com | @kedarsdixit | M +91 90499 15588 | T +91 (20) 6703 4783
> > Persistent Systems | Partners In Innovation | www.persistent.com
>
>
--
Best Regards,
Ayan Guha
