
Searched refs:smp_mb__after_spinlock (Results 1 – 21 of 21) sorted by relevance

/Linux-v6.1/tools/memory-model/litmus-tests/
MP+polockmbonce+poacquiresilsil.litmus
    6 * Do spinlocks combined with smp_mb__after_spinlock() provide order
    18 smp_mb__after_spinlock();
Z6.0+pooncelock+poonceLock+pombonce.litmus
    6 * This litmus test demonstrates how smp_mb__after_spinlock() may be
    27 smp_mb__after_spinlock();
README
    74 Protect the access with a lock and an smp_mb__after_spinlock()
    153 As above, but with smp_mb__after_spinlock() immediately
/Linux-v6.1/kernel/kcsan/
selftest.c
    155 KCSAN_CHECK_READ_BARRIER(smp_mb__after_spinlock()); in test_barrier()
    184 KCSAN_CHECK_WRITE_BARRIER(smp_mb__after_spinlock()); in test_barrier()
    216 KCSAN_CHECK_RW_BARRIER(smp_mb__after_spinlock()); in test_barrier()
kcsan_test.c
    573 KCSAN_EXPECT_READ_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
    618 KCSAN_EXPECT_WRITE_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
    663 KCSAN_EXPECT_RW_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
/Linux-v6.1/arch/xtensa/include/asm/
spinlock.h
    18 #define smp_mb__after_spinlock() smp_mb()
/Linux-v6.1/arch/csky/include/asm/
spinlock.h
    10 #define smp_mb__after_spinlock() smp_mb()
/Linux-v6.1/arch/arm64/include/asm/
spinlock.h
    12 #define smp_mb__after_spinlock() smp_mb()
/Linux-v6.1/arch/powerpc/include/asm/
spinlock.h
    14 #define smp_mb__after_spinlock() smp_mb()
/Linux-v6.1/arch/riscv/include/asm/
barrier.h
    72 #define smp_mb__after_spinlock() RISCV_FENCE(iorw,iorw)
/Linux-v6.1/include/linux/
spinlock.h
    174 #ifndef smp_mb__after_spinlock
    175 #define smp_mb__after_spinlock() kcsan_mb()
/Linux-v6.1/tools/memory-model/
linux-kernel.bell
    33 'after-spinlock (*smp_mb__after_spinlock*) ||
linux-kernel.def
    25 smp_mb__after_spinlock() { __fence{after-spinlock}; }
/Linux-v6.1/tools/memory-model/Documentation/
recipes.txt
    160 of smp_mb__after_spinlock():
    174 smp_mb__after_spinlock();
    187 This addition of smp_mb__after_spinlock() strengthens the lock acquisition
ordering.txt
    160 o smp_mb__after_spinlock(), which provides full ordering subsequent
explanation.txt
    2558 smp_mb__after_spinlock(). The LKMM uses fence events with special
    2570 smp_mb__after_spinlock() orders po-earlier lock acquisition
/Linux-v6.1/kernel/
kthread.c
    1465 smp_mb__after_spinlock(); in kthread_unuse_mm()
exit.c
    507 smp_mb__after_spinlock(); in exit_mm()
/Linux-v6.1/kernel/rcu/
tree_nocb.h
    947 smp_mb__after_spinlock(); /* Timer expire before wakeup. */ in do_nocb_deferred_wakeup_timer()
/Linux-v6.1/Documentation/RCU/
whatisRCU.rst
    636 smp_mb__after_spinlock();
    662 been able to write-acquire the lock otherwise. The smp_mb__after_spinlock()
/Linux-v6.1/kernel/sched/
core.c
    1769 smp_mb__after_spinlock(); in uclamp_sync_util_min_rt_default()
    4077 smp_mb__after_spinlock(); in try_to_wake_up()
    6439 smp_mb__after_spinlock(); in __schedule()