Spark SQL nested query syntax issue

The following SQL fails when run in Spark SQL.

SQL:

SELECT his.name, his.oid FROM his_data_zadd AS his WHERE his.value=(SELECT MAX(temp_t.value) FROM his_data_zadd AS temp_t)

Error message:

py4j.protocol.Py4JJavaError: An error occurred while calling o32.sql.
: java.lang.RuntimeException: [1.76] failure: ``)'' expected but identifier MAX found

SELECT his.name, his.oid FROM his_data_zadd AS his WHERE his.value=(SELECT MAX(temp_t.value) FROM his_data_zadd AS temp_t)

Is this a limitation of Spark SQL's parser, or is there something wrong with my SQL in the first place?
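
For context, here is a minimal sketch of how the query was presumably being submitted (the py4j error points to PySpark; the SQLContext setup and data source below are assumptions, not taken from the question):

# Assumed reproduction on PySpark 1.x with a plain SQLContext.
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="subquery-repro")
sqlContext = SQLContext(sc)

# Hypothetical source; the question does not say how his_data_zadd is loaded.
df = sqlContext.read.parquet("/path/to/his_data_zadd")
df.registerTempTable("his_data_zadd")

# The query from the question; the plain SQLContext parser rejects the
# scalar subquery with: ``)'' expected but identifier MAX found
sqlContext.sql(
    "SELECT his.name, his.oid FROM his_data_zadd AS his "
    "WHERE his.value = (SELECT MAX(temp_t.value) FROM his_data_zadd AS temp_t)"
).show()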

2 Answers

Try replacing SQLContext with HiveContext.
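
A minimal sketch of that suggestion, assuming PySpark on Spark 1.x (whether the scalar subquery then parses still depends on the Spark/Hive version):

from pyspark.sql import HiveContext

# Assumption: `sc` is the existing SparkContext of the job.
sqlContext = HiveContext(sc)  # drop-in replacement for SQLContext(sc); uses the HiveQL parser

# Assumes his_data_zadd is a Hive table or has been registered as a temp table.
sqlContext.sql(
    "SELECT his.name, his.oid FROM his_data_zadd AS his "
    "WHERE his.value = (SELECT MAX(temp_t.value) FROM his_data_zadd AS temp_t)"
).show()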

Spark SQL in Spark 2.2.1 supports Hive syntax, including subqueries.
For example:

SELECT aa.user_id,
       aa.buyTimes,
       aa.sumOrderAmount
FROM  (SELECT user_id,
              Count(1)          AS buyTimes,
              Sum(a.real_price) AS sumOrderAmount,
              Max(real_price)   AS maxPrice
       FROM   global_temp.order a
       WHERE  1 = 1
          AND a.user_id = (SELECT Max(temp.user_id)
                           FROM   global_temp.order AS temp)
          AND a.status != 0
       GROUP  BY a.user_id) aa
WHERE  1 = 1
LIMIT  50

This executes correctly.
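
To tie this back to the original question, here is a sketch of running that exact query on Spark 2.x (the SparkSession setup and data source below are assumptions):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scalar-subquery-demo").getOrCreate()

# Assumption: his_data_zadd is loaded from some source, e.g. Parquet.
df = spark.read.parquet("/path/to/his_data_zadd")
df.createOrReplaceTempView("his_data_zadd")

# On Spark 2.x the uncorrelated scalar subquery in the WHERE clause parses and runs.
spark.sql(
    "SELECT his.name, his.oid FROM his_data_zadd AS his "
    "WHERE his.value = (SELECT MAX(temp_t.value) FROM his_data_zadd AS temp_t)"
).show()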
