The 4100S datasheet does not directly address these specs, but you should be able to infer some of them.
There is no setup time specified except for SWDI. In general with I2C, SDA from the sender is changed on the SCL falling edge and sampled on the SCL rising edge. Since the max bit rate is 1 Mbps, the expected time from the SDA bit change on the SCL falling edge to the SCL rising edge is 0.5 us. This should not be a problem for SDA 1->0 transitions, since the SDA driver is open-drain and actively drives the line low strongly; the SCL fall edge to SDA fall should be under 30 ns. The 0->1 transitions might be a problem at 1 Mbps, because the driver passively allows the signal to 'float' to 1. This 'float' rise time depends on the total SDA capacitance and the pull-up resistance. If you follow the I2C convention for Csda and Rsda, you should not have any problem. At lower bit rates, it definitely should not be a problem.
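To sanity-check whether the 0->1 'float' fits in the 0.5 us window at 1 Mbps, you can model the open-drain rise with a simple RC charging curve. The resistor and capacitor values below are illustrative, not from the datasheet:

```python
import math

# Time for an open-drain SDA line to float up to a logic-high threshold,
# using the RC charging model v(t) = VDD * (1 - exp(-t / RC)).
# Solving for v(t) = vth_frac * VDD gives t = -RC * ln(1 - vth_frac).
def time_to_threshold(r_ohms, c_farads, vth_frac=0.7):
    return -r_ohms * c_farads * math.log(1.0 - vth_frac)

# Hypothetical bus: 2.2 kOhm pull-up, 150 pF total SDA capacitance.
t = time_to_threshold(2_200, 150e-12)
print(f"0->1 float time ~ {t * 1e9:.0f} ns vs 500 ns window at 1 Mbps")
```

If the computed float time approaches the half-bit window, reduce the pull-up resistance or the bus capacitance (or drop the bit rate).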
There is no hold time specified except for SWDI. The hold time for GPIO should be very close to 0 ns. Since the input buffer has a propagation delay, and the Sync Mode setting (Single- or Double-sync to BUS_CLK) adds synchronization delay, the hold time is effectively built in. This shouldn't be an issue.
Here is a short list of general GPIO rise and fall times based on strong drive.
For you, this applies if you are implementing an I2C master (SCL and SDA) or an I2C slave (SDA sends).
Since I2C actively drives low (1->0 transition), the fall time of 25 ns is insignificant compared with the 500 ns setup time available at 1 Mbps.
Since I2C is passively pulled (floats) high on 0->1 transitions, the rise time varies with the SDA capacitance and the pull-up resistance. Following the recommended I2C Csda and Rsda values for the desired data rate should be sufficient for reliable data exchange. The committee that specified Csda and Rsda used an RC-based exponential rise-time model to establish these values, so you can calculate (or simulate) the same design parameters yourself.
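As a sketch of that RC-based calculation: the I2C spec measures rise time between 30% and 70% of VDD, which for an exponential RC charge works out to t_r = RC * ln(0.7/0.3). The component values below are examples, not recommendations:

```python
import math

# I2C rise time per the spec's 30%-to-70% measurement points.
# For v(t) = VDD * (1 - exp(-t / RC)), the time between 0.3*VDD and
# 0.7*VDD is RC * (ln(1/0.3) - ln(1/0.7)) = RC * ln(0.7/0.3) ~ 0.847*RC.
def i2c_rise_time(r_pullup_ohms, c_bus_farads):
    return r_pullup_ohms * c_bus_farads * math.log(0.7 / 0.3)

# Example: 1 kOhm pull-up with 100 pF of bus capacitance.
tr = i2c_rise_time(1_000, 100e-12)
print(f"rise time ~ {tr * 1e9:.0f} ns")
```

Compare the result against the spec's maximum rise time for your speed grade (e.g. Fast-mode Plus at 1 Mbps allows 120 ns) to size the pull-up.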
Len "Engineering is an Art. The Art of Compromise."