Executing a Spark Scala command that uses the $"colName" column syntax fails with the error below.
val joinedDF = matchMap.join(qubMap,"custId").sort($"valueinUSD".desc).limit(100000)
error: value $ is not a member of StringContext
Scala does not recognize the $ sign because the implicit conversions that extend StringContext with the $ column interpolator were not imported. In Spark 2.x and later these are provided by spark.implicits._ (on the SparkSession); in Spark 1.x they come from sqlContext.implicits._.
Resolution: add the implicits import before using the $ syntax. Refer to the open-source Apache Spark documentation on spark.implicits._ (or sqlContext.implicits._ on older versions).
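A minimal sketch of the fix, assuming a SparkSession named spark and the matchMap and qubMap DataFrames from the failing command (with columns custId and valueinUSD):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("example") // hypothetical app name for illustration
  .getOrCreate()

// This import brings the implicit StringContext extension into scope,
// which is what enables the $"colName" column syntax.
import spark.implicits._

// The original command now compiles: $ resolves via the imported implicits.
val joinedDF = matchMap.join(qubMap, "custId")
  .sort($"valueinUSD".desc)
  .limit(100000)
```

On Spark 1.x, replace the import with sqlContext.implicits._. Alternatively, the same expression can be written without implicits at all using org.apache.spark.sql.functions.col, e.g. sort(col("valueinUSD").desc).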