Freeing C array memory from Swift

var cpuInfo: processor_info_array_t = nil
var numCpuInfo: mach_msg_type_number_t = 0
var coresTotalUsage: Float = 0.0
var numCPUsU: natural_t = 0
let err = host_processor_info(mach_host_self(), PROCESSOR_CPU_LOAD_INFO, &numCPUsU, &cpuInfo, &numCpuInfo)
assert(err == KERN_SUCCESS, "Failed call to host_processor_info")


Hi, I am calling the C API host_processor_info above to get processor load information from Swift, no problem. cpuInfo is an inout parameter (a pointer) which on return points to an array of CPU information allocated by the API. The caller is responsible for freeing that memory; I can do it easily from Objective-C, but I have had no luck from Swift. I know I could wrap this call in an Objective-C class extension, but I am trying to learn Swift and would like to avoid an Objective-C solution if possible.

In Objective-C I would free it with:

size_t cpuInfoSize = sizeof(integer_t) * numCpuInfo;
vm_deallocate(mach_task_self(), (vm_address_t) cpuInfo, cpuInfoSize);


In Swift, cpuInfo is an UnsafeMutablePointer, which is not convertible to vm_address_t.

Any help appreciated, thanks.

2 answers


processor_info_array_t is a pointer type, and vm_address_t is an integer type (ultimately an alias for UInt). (Judging from the comments in <i386/vm_types.h>, this may be for historical reasons.) The only way to convert a pointer to an integer of the same size in Swift is unsafeBitCast.
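As a minimal sketch of that pointer-to-integer conversion (using the current unsafeBitCast(_:to:) spelling, with an ordinary Int32 value standing in for the Mach types):

```swift
var value: Int32 = 42
withUnsafePointer(to: &value) { ptr in
    // Reinterpret the pointer's bits as an unsigned integer of the
    // same size; this works because UInt and a pointer have the same width.
    let addr = unsafeBitCast(ptr, to: UInt.self)
    print(addr != 0)  // prints "true": a valid pointer has a nonzero address
}
```

On Darwin the same call converts cpuInfo to a vm_address_t, since vm_address_t is ultimately just UInt.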

mach_init.h defines

extern mach_port_t      mach_task_self_;
#define mach_task_self() mach_task_self_




In Swift, only the external variable is visible, not the macro.

This gives:

let cpuInfoSize = vm_size_t(sizeof(integer_t)) * vm_size_t(numCpuInfo)
vm_deallocate(mach_task_self_, unsafeBitCast(cpuInfo, vm_address_t.self), cpuInfoSize)



In Swift 4, the equivalent code looks like this:

let cpuInfoSize = vm_size_t(MemoryLayout<integer_t>.stride * Int(numCpuInfo))
vm_deallocate(mach_task_self_, vm_address_t(bitPattern: cpuInfo), cpuInfoSize)




In particular, the initializer UInt(bitPattern:) now seems to be preferred over unsafeBitCast() for initializing an unsigned integer with a pointer's bit pattern (I assume this usage is no longer considered "unsafe"). It also handles a nil pointer correctly, returning 0 in that case.
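That behavior can be checked with a small, self-contained sketch (plain Swift, no Mach types involved):

```swift
// UInt(bitPattern:) turns a (possibly nil) pointer into its integer address.
let buf = UnsafeMutablePointer<Int32>.allocate(capacity: 4)
defer { buf.deallocate() }

print(UInt(bitPattern: buf) != 0)  // prints "true": a live allocation has a nonzero address

// A nil pointer maps to 0 instead of trapping.
let nilAddr = UInt(bitPattern: nil as UnsafeMutableRawPointer?)
print(nilAddr == 0)                // prints "true"
```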
