== Physical Plan ==
CollectLimit (2)
+- Scan hive spark_catalog.default.alltypes (1)

(1) Scan hive spark_catalog.default.alltypes
Output [17]: [STRING#3514, DOUBLE#3515, INTEGER#3516, BIGINT#3517L, FLOAT#3518, DECIMAL#3519, NUMBER#3520, BOOLEAN#3521, DATE#3522, TIMESTAMP#3523, DATETIME#3524, BINARY#3525, ARRAY#3526, MAP#3527, STRUCT#3528, VARCHAR#3529, CHAR#3530]
Arguments: [STRING#3514, DOUBLE#3515, INTEGER#3516, BIGINT#3517L, FLOAT#3518, DECIMAL#3519, NUMBER#3520, BOOLEAN#3521, DATE#3522, TIMESTAMP#3523, DATETIME#3524, BINARY#3525, ARRAY#3526, MAP#3527, STRUCT#3528, VARCHAR#3529, CHAR#3530], HiveTableRelation [`spark_catalog`.`default`.`alltypes`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [STRING#3514, DOUBLE#3515, INTEGER#3516, BIGINT#3517L, FLOAT#3518, DECIMAL#3519, NUMBER#3520, BOO..., Partition Cols: []]

(2) CollectLimit
Input [17]: [STRING#3514, DOUBLE#3515, INTEGER#3516, BIGINT#3517L, FLOAT#3518, DECIMAL#3519, NUMBER#3520, BOOLEAN#3521, DATE#3522, TIMESTAMP#3523, DATETIME#3524, BINARY#3525, ARRAY#3526, MAP#3527, STRUCT#3528, VARCHAR#3529, CHAR#3530]
Arguments: 1000
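
A plan of this shape (a Hive table scan feeding a CollectLimit node whose argument is the row limit) is what Spark prints for a LIMIT query when the formatted explain mode is used. The PySpark sketch below is only an assumed reconstruction of how such output could be produced; the session setup, application name, and the exact query are assumptions, not taken from this document.

    # Minimal sketch (assumptions: Hive support is configured and a table
    # named `alltypes` exists in the default database).
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("explain-alltypes")   # hypothetical application name
        .enableHiveSupport()           # required so Hive tables are resolvable
        .getOrCreate()
    )

    # A LIMIT query over the Hive table yields a plan of the same shape as above:
    # Scan hive ... followed by CollectLimit with the limit value as its argument.
    spark.sql("SELECT * FROM alltypes LIMIT 1000").explain(mode="formatted")

The same output can be obtained in SQL with `EXPLAIN FORMATTED SELECT * FROM alltypes LIMIT 1000`; `explain(mode="formatted")` is available on DataFrames since Spark 3.0.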