Mutex locks and queues

First, the mutex lock:

Take ticket grabbing as an example: a data file holds the ticket count, e.g. { "number": 1 } means there is 1 ticket left.
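The code below reads the ticket count from a file named info_user.txt; here is a minimal sketch for creating that file with one ticket (the file name and the {"number": 1} content are taken from the example itself):

import json

# write the initial ticket count so the processes below have something to read
with open('info_user.txt', 'w', encoding='utf-8') as f:
    json.dump({'number': 1}, f)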

from multiprocessing import Process, Lock
import json
import time

def set(i):
    data_read = json.load(open('info_user.txt', 'r', encoding='utf-8'))  # check the ticket count first; every process may read it concurrently
    time.sleep(2)  # simulate network delay while reading the data
    print('%s checked remaining tickets: %s' % (i, data_read['number']))

def take(i):
    data = json.load(open('info_user.txt', 'r', encoding='utf-8'))
    if data['number'] > 0:
        data['number'] = data['number'] - 1
        time.sleep(2)  # simulate network delay while writing the data
        json.dump(data, open('info_user.txt', 'w', encoding='utf-8'))
        print('%s bought a ticket successfully' % i)

def run(i):
    set(i)
    take(i)



if __name__ == '__main__':

    for i in range(10):
        p = Process(target=run,args=(i,))
        p.start()


Result: you will find that every process buys a ticket, even though there was only one ticket. Multiprocessing is efficient, but unsynchronized access leaves the data in an inconsistent, unsafe state, and the mutex lock exists to solve exactly this problem.

0 checked remaining tickets: 1
1 checked remaining tickets: 1
2 checked remaining tickets: 1
3 checked remaining tickets: 1
4 checked remaining tickets: 1
5 checked remaining tickets: 1
6 checked remaining tickets: 1
7 checked remaining tickets: 1
8 checked remaining tickets: 1
9 checked remaining tickets: 1
0 bought a ticket successfully
1 bought a ticket successfully
2 bought a ticket successfully
3 bought a ticket successfully
4 bought a ticket successfully
5 bought a ticket successfully
6 bought a ticket successfully
7 bought a ticket successfully
8 bought a ticket successfully
9 bought a ticket successfully

Solution: add a mutex lock here, trading some efficiency for data safety.

from multiprocessing import Process, Lock
import json
import time

def set(i):
    data_read = json.load(open('info_user.txt', 'r', encoding='utf-8'))  # check the ticket count first; every process may read it concurrently
    time.sleep(2)  # simulate network delay while reading the data
    print('%s checked remaining tickets: %s' % (i, data_read['number']))

def take(i):
    data = json.load(open('info_user.txt', 'r', encoding='utf-8'))
    if data['number'] > 0:
        data['number'] = data['number'] - 1
        time.sleep(2)  # simulate network delay while writing the data
        json.dump(data, open('info_user.txt', 'w', encoding='utf-8'))
        print('%s bought a ticket successfully' % i)

def run(i, lock):
    set(i)
    lock.acquire()   # "with lock:" can be used instead: it calls lock.acquire() and automatically calls lock.release() when the block finishes
    take(i)
    lock.release()


if __name__ == '__main__':
    lock = Lock()
    for i in range(10):
        p = Process(target=run, args=(i, lock))
        p.start()

Result:

0 checked remaining tickets: 1
2 checked remaining tickets: 1
3 checked remaining tickets: 1
1 checked remaining tickets: 1
4 checked remaining tickets: 1
6 checked remaining tickets: 1
5 checked remaining tickets: 1
8 checked remaining tickets: 1
7 checked remaining tickets: 1
9 checked remaining tickets: 1
0 bought a ticket successfully
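As the comment in run() notes, the acquire/release pair can also be written with a context manager; a minimal sketch of that variant (only run() changes, everything else stays the same):

def run(i, lock):
    set(i)
    with lock:   # acquires the lock, and releases it automatically when the block exits, even on error
        take(i)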

Second, the queue

Queues and pipes: both store their data in memory. The queue is implemented on top of a pipe plus a lock, which frees us from the tricky problems of managing locks ourselves, so the queue is the best choice for inter-process communication.

Definition: obj = Queue(maxsize), where maxsize limits how many items the queue can hold.

But: the queue lives in memory, so maxsize is ultimately limited by the available memory.

Large data should not be put into the queue.
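To see the maxsize limit in action, here is a small sketch (not from the original example) that fills a queue of size 2 and then tries a non-blocking put, which raises queue.Full:

import queue                      # only for the queue.Full exception
from multiprocessing import Queue

q = Queue(2)                      # at most 2 items may sit in the queue
q.put('a')
q.put('b')
try:
    q.put_nowait('c')             # the queue is full, so a non-blocking put raises
except queue.Full:
    print('queue is full, item rejected')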

# q.put() inserts an item into the queue.
# q.get() reads one item from the queue and removes it.
from multiprocessing import Queue

q = Queue(3)

q.put('1234')
q.put([1, 2, 3, 'e'])
q.put(1)
print(q.full())   # check whether the queue is full

print(q.get())
print(q.get())
print(q.get())
print(q.empty())  # check whether the queue is now empty

# print(q.get())  # the queue is empty now, so this get() would block here forever
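Since the main job of the queue is communication between processes, here is a minimal producer/consumer sketch (not part of the original example); None is used here as an end-of-stream marker chosen purely for illustration:

from multiprocessing import Process, Queue

def producer(q):
    for n in range(3):
        q.put('item %s' % n)   # hand data over to the consumer process
    q.put(None)                # sentinel: tell the consumer there is nothing more

def consumer(q):
    while True:
        item = q.get()         # blocks until the producer sends something
        if item is None:       # sentinel received, stop consuming
            break
        print('consumed', item)

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()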

 


Origin www.cnblogs.com/whileke/p/11461645.html