But somehow the Spark interpreter works without this problem. Why is that the case 
if the problem is on the Zeppelin UI side?

From: Jeff Zhang <zjf...@gmail.com>
Reply-To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Date: Friday, November 3, 2017 at 4:48 PM
To: "users@zeppelin.apache.org" <users@zeppelin.apache.org>
Subject: Re: Livy interpreter Scala code string interpolation


Yes, this is due to Zeppelin's dynamic forms.
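
[Editor's note, for context; this snippet is an illustration and not from the original thread.] Zeppelin's dynamic forms scan paragraph text for `${name}` and `${name=default}` patterns, replace them with the value entered in a generated input widget, and only then send the code to the interpreter:

```
%livy.spark
// Zeppelin renders an input box titled "bucket" (prefilled with "my-bucket")
// and substitutes its value into the text before Livy ever sees it.
val path = "s3a://${bucket=my-bucket}/data"
```

This is why `${...}` expressions vanish from the code and input boxes appear under the paragraph.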


Mohit Jaggi <mohitja...@gmail.com> wrote on Saturday, November 4, 2017 at 2:28 AM:
I run into similar issues with shell scripts that use ${var}. Can we use a 
different magic notation for Z, one that has a lower chance of colliding with 
shell scripts and Scala code? This will be hard to do due to the variety of 
interpreters supported. Perhaps this can be made configurable?

On Fri, Nov 3, 2017 at 10:36 AM, Tan, Jialiang <j...@ea.com> wrote:
Hi,

When writing Spark Scala code in Zeppelin with the Livy interpreter, string 
interpolation does not seem to work. For example:

val devices = sc.objectFile[(VertexId, DeviceAttr)](
  s"s3a://${bucket}/${datasetS3Prefix}/${tableName}/dt=${end_dt}/tid=${tid}/${datasetS3Suffix}")

I think the dollar-curly-bracket expressions were interpreted as JavaScript template 
literals and hence disappear by the time the Livy server receives the message; the code becomes:

val devices = sc.objectFile[(VertexId, DeviceAttr)](
   s"s3a://///dt= /tid= / ")

At the same time, a couple of input boxes appear at the bottom of this 
paragraph in the Zeppelin UI, with titles matching the names in the original 
dollar curly brackets.

Is this a bug? How can we overcome the string interpolation issue?

Thanks,
Jialiang
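
[Editor's note.] One workaround sometimes suggested for this collision (a sketch, not an official fix from this thread) is to drop the curly braces in the Scala interpolation where possible: Zeppelin's dynamic-form scanner only matches the `${...}` shape, while plain `$name` is still valid Scala. Using hypothetical stand-in values for the identifiers in the original snippet:

```scala
// Hypothetical values standing in for the variables in the original code.
val bucket = "my-bucket"
val prefix = "device-data"

// Brace-free interpolation: $bucket and $prefix are not recognized as
// Zeppelin dynamic forms, so they reach the interpreter unchanged.
val path = s"s3a://$bucket/$prefix/dt=2017-11-03"
println(path)  // s3a://my-bucket/device-data/dt=2017-11-03
```

This only works when the interpolated expression is a bare identifier; anything needing `${expr}` (e.g. `${tableName.trim}`) would still collide with the form syntax.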
