With Spark 2.0.0, the SQL context and Hive context have been unified into a single SparkSession object. Right now, hive_context() returns that object:
```r
> hive_context(sc)
<jobj[8]>
  class org.apache.spark.sql.SparkSession
  org.apache.spark.sql.SparkSession@65b602ec
```
But we should be able to access this with a plain spark_session(sc) call.
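A minimal sketch of what that accessor could look like, assuming the existing hive_context() and spark_version() helpers and that on Spark 2.x hive_context() already holds the SparkSession (names and behavior here are illustrative, not the final API):

```r
# Hypothetical sketch of the proposed accessor: on Spark 2.x the object
# returned by hive_context() is already the unified SparkSession, so
# spark_session() can simply hand it back; earlier versions have no
# SparkSession to return.
spark_session <- function(sc) {
  if (spark_version(sc) >= "2.0.0") {
    hive_context(sc)  # a jobj wrapping org.apache.spark.sql.SparkSession
  } else {
    stop("spark_session() requires Spark 2.0.0 or newer")
  }
}
```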