Error log

FAILED: Error in acquiring locks: Lock acquisition for LockRequest(component:[LockComponent(type:EXCLUSIVE, level:TABLE, dbname:dw, tablename:dim1, operationType:INSERT, isAcid:false), LockComponent(type:SHARED_READ, level:TABLE, dbname:ods, tablename:ds_data2, operationType:SELECT), LockComponent(type:SHARED_READ, level:TABLE, dbname:ods, tablename:dim3, operationType:SELECT)], txnid:0, user:airflow, hostname:hubserver1, agentInfo:airflow_20190731065910_5be6e4a9-8e33-4b8d-aa0b-aa778cca3105) timed out after 5503478ms. LockResponse(lockid:69648, state:WAITING)
Careful analysis shows that the job failed because of the lock with lockid:69648.

Hive locks

Hive has two kinds of locks: Shared (S) and Exclusive (X). Taking the log above as an example, the original SQL being executed was roughly:

insert overwrite table dim1 partition (bus_date='2019-07-29') select * from data2 t1 left join dim3 t2 on t1.id=t2.id;

The two tables being read take S locks, while dim1, the target of the insert, takes an X lock. Operations that only trigger S locks can run concurrently; as soon as any operation triggers an X lock on a table or partition, no other job can run concurrently against that table or partition.
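A rough sketch of these rules, reusing the table names from the example above (the statements themselves are illustrative, not from the original job):

-- Both statements take only S locks on data2, so they can run at the same time:
select count(*) from data2;
select max(id) from data2;

-- The insert takes an X lock on dim1 (table level in the log above); it must wait for every
-- existing lock on dim1 to be released, and while it runs it blocks all other jobs on dim1:
insert overwrite table dim1 partition (bus_date='2019-07-29') select * from data2;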

SHOW LOCKS <TABLE_NAME>; -- view the locks currently held on a table
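To locate and, if necessary, force-release the blocking lock, something like the following can be used (a sketch: UNLOCK TABLE works with the legacy ZooKeeper/embedded lock managers, not with the transactional DbTxnManager, and force-releasing a lock held by a running job risks corrupting its output):

SHOW LOCKS;               -- list every lock currently held
SHOW LOCKS dim1 EXTENDED; -- show who holds locks on dim1 and the query that acquired them
UNLOCK TABLE dim1;        -- force-release the lock on dim1 (legacy lock manager only; use with care)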

Solution

Add the following statement at the top of the failing SQL:
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager;
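Putting it together, the fixed script would look roughly like this (a sketch: hive.support.concurrency=false is an optional extra that stops Hive from requesting locks at all, and both settings can also be changed globally in hive-site.xml, which then affects every session):

set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager;
set hive.support.concurrency=false; -- optional: do not request any locks for this session

insert overwrite table dim1 partition (bus_date='2019-07-29')
select * from data2 t1 left join dim3 t2 on t1.id=t2.id;

Note that DummyTxnManager is Hive's pre-ACID transaction manager, so this workaround also gives up ACID transaction semantics for the session.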

