== Physical Plan ==
CollectLimit (2)
+- Scan hive spark_catalog.default.alltypes (1)

(1) Scan hive spark_catalog.default.alltypes
Output [17]: [STRING#4006, DOUBLE#4007, INTEGER#4008, BIGINT#4009L, FLOAT#4010, DECIMAL#4011, NUMBER#4012, BOOLEAN#4013, DATE#4014, TIMESTAMP#4015, DATETIME#4016, BINARY#4017, ARRAY#4018, MAP#4019, STRUCT#4020, VARCHAR#4021, CHAR#4022]
Arguments: [STRING#4006, DOUBLE#4007, INTEGER#4008, BIGINT#4009L, FLOAT#4010, DECIMAL#4011, NUMBER#4012, BOOLEAN#4013, DATE#4014, TIMESTAMP#4015, DATETIME#4016, BINARY#4017, ARRAY#4018, MAP#4019, STRUCT#4020, VARCHAR#4021, CHAR#4022], HiveTableRelation [`spark_catalog`.`default`.`alltypes`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [STRING#4006, DOUBLE#4007, INTEGER#4008, BIGINT#4009L, FLOAT#4010, DECIMAL#4011, NUMBER#4012, BOO..., Partition Cols: []]

(2) CollectLimit
Input [17]: [STRING#4006, DOUBLE#4007, INTEGER#4008, BIGINT#4009L, FLOAT#4010, DECIMAL#4011, NUMBER#4012, BOOLEAN#4013, DATE#4014, TIMESTAMP#4015, DATETIME#4016, BINARY#4017, ARRAY#4018, MAP#4019, STRUCT#4020, VARCHAR#4021, CHAR#4022]
Arguments: 1000