【1】 Expected only partition pruning predicates
Solution: set spark.sql.hive.metastorePartitionPruning=false.
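A minimal sketch of applying the setting in a Spark SQL session (it can equally be passed as a `--conf` flag at submit time); this assumes a running session where the failing query is re-run afterwards:

```sql
-- Fall back to client-side partition pruning instead of pushing
-- pruning predicates down to the Hive metastore:
SET spark.sql.hive.metastorePartitionPruning=false;
```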
【2】 Error in query: Detected cartesian product for INNER join between logical plans
Project-Join condition is missing or trivial.
Use the CROSS JOIN syntax to allow cartesian products between these relations
Solution: set spark.sql.crossJoin.enabled=true.
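As the error message itself suggests, there are two ways out: enable cartesian products globally, or state the intent per query with explicit CROSS JOIN syntax. A sketch of both (table names `t1`/`t2` are hypothetical):

```sql
-- Option 1: allow cartesian products for all queries in the session:
SET spark.sql.crossJoin.enabled=true;

-- Option 2: keep the guard on, but mark this one join as intentional:
SELECT a.id, b.id
FROM t1 a CROSS JOIN t2 b;
```

The per-query CROSS JOIN form is usually the safer choice, since the global flag also silences genuinely accidental cartesian products.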
【3】 ERROR ApplicationMaster: User class threw exception: java.util.concurrent.TimeoutException: Futures timed out after [300 seconds]
Solution: set spark.sql.autoBroadcastJoinThreshold to -1 to disable broadcast joins.
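The 300-second timeout typically comes from waiting on a broadcast of a join side that is too large or too slow to collect. A minimal sketch of turning broadcast joins off in a session (also settable via `--conf` at submit time):

```sql
-- -1 disables automatic broadcast joins regardless of table size,
-- so Spark falls back to shuffle-based join strategies:
SET spark.sql.autoBroadcastJoinThreshold=-1;
```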
【4】org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: No suitable driver found for
When packaging bigquery, the build produced a spark-1.0.3 jar, which was then used to start the thriftserver. Logic inside it that accessed MySQL failed with "No suitable driver found"; from the error, the MySQL URL was never obtained. Inspecting the jar showed that the configuration file under the common module's resources had not been packaged: the spark module's pom.xml did not include the common module's resource directory when building.
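The fix described above can be sketched as a `<resources>` entry in the spark module's pom.xml that pulls in the common module's resource directory alongside its own. The paths below are assumptions based on a conventional multi-module layout, not the actual project structure:

```xml
<!-- Sketch: assumed layout where the spark and common modules are siblings.
     Without the first <resource> entry, files such as the MySQL connection
     config in common's resources are left out of the packaged jar. -->
<build>
  <resources>
    <resource>
      <directory>../common/src/main/resources</directory>
    </resource>
    <!-- Keep the module's own resources; listing <resources> explicitly
         replaces the Maven default, so this entry must be restated. -->
    <resource>
      <directory>src/main/resources</directory>
    </resource>
  </resources>
</build>
```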