Hello,

If you have a file with different types of data, it is preferable to use another file format, such as TSV, ORC or Parquet.
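If you do need to keep loading the raw CSV, something along these lines may help as a starting point; the OpenCSVSerde that ships with recent Hive versions can cope with quoted fields that contain commas, colons or embedded quotes. The table names, columns and HDFS path below are only placeholders for your data, not your actual schema:

-- Staging table over the raw CSV; OpenCSVSerde parses quoted fields,
-- but note it exposes every column as STRING.
CREATE EXTERNAL TABLE staging_csv (
  id STRING,
  description STRING
  -- ... remaining columns, all STRING with this SerDe
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  "separatorChar" = ",",
  "quoteChar"     = "\"",
  "escapeChar"    = "\\"
)
STORED AS TEXTFILE
LOCATION '/user/vijay/csv_staging';   -- placeholder path

-- Then copy into an ORC-backed table so later queries avoid CSV parsing
-- and you can cast columns to proper types as needed.
CREATE TABLE facts_orc STORED AS ORC AS
SELECT * FROM staging_csv;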
Best,

On Fri, Oct 30, 2015 at 4:16 PM, Vijaya Narayana Reddy Bhoomi Reddy <
vijaya.bhoomire...@whishworks.com> wrote:

> Hi,
>
> I have a CSV file which contains a hundred thousand rows and about 200+
> columns. Some of the columns have free-text information, which means they
> might contain characters like commas, colons, quotes etc. within the
> column content.
>
> What is the best way to load such a CSV file into Hive?
>
> Another serious issue: I have stored the file in a location in HDFS and
> then created an external Hive table on it. However, upon running CREATE
> EXTERNAL TABLE using the HDP Hive View, the original CSV is no longer
> present in the folder where it is meant to be. I am not sure how HDP
> processes it or where it is stored. My understanding was that EXTERNAL
> tables wouldn't be moved from their original HDFS location?
>
> Request someone to help out!
>
> Thanks & Regards
> Vijay

--
Daniel Lopes, B.Eng
Data Scientist - BankFacil
CREA/SP 5069410560
Mob +55 (18) 99764-2733
Ph +55 (11) 3522-8009
http://about.me/dannyeuu
Av. Nova Independência, 956, São Paulo, SP
Bairro Brooklin Paulista
CEP 04570-001
https://www.bankfacil.com.br