== Physical Plan ==
CollectLimit (2)
+- Scan hive spark_catalog.default.alltypes (1)


(1) Scan hive spark_catalog.default.alltypes
Output [17]: [STRING#1363, DOUBLE#1364, INTEGER#1365, BIGINT#1366L, FLOAT#1367, DECIMAL#1368, NUMBER#1369, BOOLEAN#1370, DATE#1371, TIMESTAMP#1372, DATETIME#1373, BINARY#1374, ARRAY#1375, MAP#1376, STRUCT#1377, VARCHAR#1378, CHAR#1379]
Arguments: [STRING#1363, DOUBLE#1364, INTEGER#1365, BIGINT#1366L, FLOAT#1367, DECIMAL#1368, NUMBER#1369, BOOLEAN#1370, DATE#1371, TIMESTAMP#1372, DATETIME#1373, BINARY#1374, ARRAY#1375, MAP#1376, STRUCT#1377, VARCHAR#1378, CHAR#1379], HiveTableRelation [`spark_catalog`.`default`.`alltypes`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [STRING#1363, DOUBLE#1364, INTEGER#1365, BIGINT#1366L, FLOAT#1367, DECIMAL#1368, NUMBER#1369, BOO..., Partition Cols: []]

(2) CollectLimit
Input [17]: [STRING#1363, DOUBLE#1364, INTEGER#1365, BIGINT#1366L, FLOAT#1367, DECIMAL#1368, NUMBER#1369, BOOLEAN#1370, DATE#1371, TIMESTAMP#1372, DATETIME#1373, BINARY#1374, ARRAY#1375, MAP#1376, STRUCT#1377, VARCHAR#1378, CHAR#1379]
Arguments: 1000