This is the official implementation of the paper:
Robust Client-Server Watermarking for Split Federated Learning
Split Federated Learning (SFL) is renowned for its privacy-preserving nature and low computational overhead among decentralized machine learning paradigms. In this framework, clients employ lightweight models to process private data locally and transmit intermediate outputs to a powerful server for further computation. However, SFL is a double-edged sword: while it enables edge computing and enhances privacy, it also introduces intellectual property ambiguity, as both clients and the server jointly contribute to training. Existing watermarking techniques fail to protect both sides since no single participant possesses the complete model. To address this, we propose RISE, a Robust model Intellectual property protection scheme using client-Server watermark Embedding for SFL. Specifically, RISE adopts an asymmetric client–server watermarking design: the server embeds feature-based watermarks through a loss regularization term, while clients embed backdoor-based watermarks by injecting predefined trigger samples into private datasets. This co-embedding strategy enables both clients and the server to verify model ownership. Experimental results on standard datasets and multiple network architectures show that RISE achieves a watermark detection rate above 95% (p-value < 0.03) across most settings. It exhibits no mutual interference between client- and server-side watermarks and remains robust against common removal attacks.
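To make the asymmetric design concrete, here is a minimal NumPy sketch of the two embedding mechanisms. This is not the repository's implementation: the corner-patch trigger, the secret random projection matrix, and the hinge-style regularizer are illustrative assumptions standing in for the paper's actual trigger design and feature-based loss term.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Client-side backdoor watermark (hypothetical trigger design) ---
def stamp_trigger(images, patch_value=1.0, size=3):
    """Stamp a small corner patch onto a batch of HxW images; in RISE,
    such trigger samples are relabeled to a fixed target class and
    mixed into the client's private training set."""
    triggered = images.copy()
    triggered[:, -size:, -size:] = patch_value
    return triggered

# --- Server-side feature watermark (sign-embedding regularizer) ---
def sign_watermark_loss(weights, secret_key, bits):
    """Hinge-style regularizer pushing the projections of the server
    model's weights (through a secret random matrix) toward the signs
    of a secret bit string; added to the task loss during training."""
    signs = 2.0 * bits - 1.0              # {0,1} -> {-1,+1}
    margin = 1.0 - signs * (secret_key @ weights)
    return np.maximum(0.0, margin).mean()

def extract_bits(weights, secret_key):
    """Verification: recover the embedded bits from projection signs."""
    return (secret_key @ weights > 0).astype(int)

# Toy demonstration: embed 40 bits into a 256-dim weight vector by
# gradient descent on the regularizer alone.
bits = rng.integers(0, 2, size=40)
secret_key = rng.standard_normal((40, 256)) / 16.0
w = rng.standard_normal(256) * 0.01
signs = 2.0 * bits - 1.0
for _ in range(300):
    active = (1.0 - signs * (secret_key @ w)) > 0   # unsatisfied bits
    grad = -(secret_key.T @ (signs * active)) / len(bits)
    w -= 0.5 * grad
recovered = extract_bits(w, secret_key)
```

In the actual scheme the regularizer is applied jointly with the task loss on the server-side sub-model, so the watermark survives normal training instead of being optimized in isolation as above.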
Figure 1: An illustration of the RISE watermark Embedding and Verification scheme.
For the RISE scheme, run the following command in the terminal, or simply execute `run.sh` from the repository:

```shell
python ./poisoned_train.py --num_clients 10 --global_epochs 200 --bd_engage 0.0 --bd_type 2 --lr 0.001 --wm_engage 0.0 --lr_stages 4 --gamma 0.5 --local_epochs 5
```

For clean SFL training, run the following command in the terminal:

```shell
python ./clean_train.py --num_clients 10 --global_epochs 200 --lr 0.001 --lr_stages 4 --gamma 0.5 --local_epochs 5
```

For the free-rider experiments, run the following command in the terminal:

```shell
python ./freerider_train.py --num_clients 3 --global_epochs 200 --bd_engage 0.0 --bd_type 2 --lr 0.001 --wm_engage 0.0 --lr_stages 4 --gamma 0.5 --local_epochs 5
```

For the single-side ablation experiments, run the following commands in the terminal:
- For RISE with client-side watermarking only:

```shell
python ./single_side_only_train.py --num_clients 10 --global_epochs 200 --bd_engage 0.0 --bd_type 2 --lr 0.001 --wm_engage 0.0 --lr_stages 4 --gamma 0.5 --local_epochs 5 --single_side_wm 1
```

- For RISE with server-side watermarking only:

```shell
python ./single_side_only_train.py --num_clients 10 --global_epochs 200 --bd_engage 0.0 --bd_type 2 --lr 0.001 --wm_engage 0.0 --lr_stages 4 --gamma 0.5 --local_epochs 5 --single_side_wm 2 --sign_bit_length 40
```

The standard CIFAR-10 and CIFAR-100 datasets can be found at the official site.
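The reported detection rates come with a significance level (p-value < 0.03). The paper's exact verification statistic is described in the paper itself; as a sketch of the idea, a one-sided binomial test quantifies how unlikely a non-watermarked model is to match the trigger labels by chance. The trial counts below are illustrative, not the paper's evaluation setup.

```python
from math import comb

def binomial_pvalue(successes, trials, chance):
    """One-sided binomial test: probability that a non-watermarked
    model, which outputs the target label with probability `chance`,
    matches the trigger set at least `successes` times out of `trials`."""
    return sum(comb(trials, k) * chance**k * (1 - chance)**(trials - k)
               for k in range(successes, trials + 1))

# Toy numbers: 19 of 20 trigger images hit the target class on a
# 10-class task (chance = 0.1), so ownership is asserted with a
# vanishingly small p-value.
p = binomial_pvalue(19, 20, 0.10)
```

A small p-value means the observed trigger-set agreement is implausible without the embedded watermark, which is the evidence a client presents during verification.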
The code is implemented in Python 3.12.4 with the following main dependencies:
numpy==2.0.0
pandas==2.3.2
Pillow==11.3.0
psutil==7.0.0
torch==2.8.0
torchvision==0.23.0
tqdm==4.67.1
The full list of dependencies can be found in `dependency_tee.txt`.
Other hardware and software environment details can be found in Suppl. 7.4 of the Supplementary Material of our paper.
