### What is the problem the feature request solves?

This epic tracks Comet native writer issues:

- [x] #2958
- [x] #2944
- [ ] #2957
- [x] #2890
- [ ] #2970
- [x] #2971
- [ ] #2985
- [ ] #3015
- [x] #3023
- [x] #3032
- [ ] #3041
- [ ] #3209
- [ ] #3521
- [ ] empty file should be skipped while write to file
- [ ] INSERT INTO TABLE - complex type but different names
- [ ] Insert overwrite table command should output correct schema: basic
- [ ] parquet timestamp conversion
- [ ] SPARK-29174 Support LOCAL in INSERT OVERWRITE DIRECTORY to data source
- [ ] SPARK-33901: ctas should not change table's schema
- [ ] SPARK-37160: CREATE TABLE AS SELECT with CHAR_AS_VARCHAR
- [ ] SPARK-38336 INSERT INTO statements with tables with default columns: positive tests
- [ ] SPARK-38811 INSERT INTO on columns added with ALTER TABLE ADD COLUMNS: Positive tests
- [ ] SPARK-43071: INSERT INTO from queries whose final operators are not projections
- [ ] write path implements onTaskCommit API correctly
- [ ] Write Spark version into Parquet metadata
- [x] #3608
- [ ] respect parquet block size

Additional tests which had to be ignored for Spark 4:

- [ ] ctas with union
- [ ] SPARK-48817: test multi inserts

### Describe the potential solution

_No response_

### Additional context

_No response_