Martin Mlostek (mlostekk) · Nomad5 · Berlin, Germany · http://www.nomad5.com

mlostekk/AudioKitSampleRate 1

Showcasing the issue with a changing sample rate leading to pitched-down audio playback

mlostekk/AFNetworking 0

A delightful networking framework for iOS, macOS, watchOS, and tvOS.

mlostekk/android-xml-sorter 0

Android Studio & IntelliJ plugin for sorting XML by name="xxx".

mlostekk/AudioKit 0

Swift audio synthesis, processing, & analysis platform for iOS, macOS and tvOS

mlostekk/BartyCrouch 0

Localization/I18n: Incrementally update your Strings files from .swift, .h, .m, .storyboard or .xib files and/or use machine-translation.

mlostekk/bluebird 0

:bird: :zap: Bluebird is a full featured promise library with unmatched performance.

mlostekk/haxe 0

Haxe - The Cross-Platform Toolkit

mlostekk/MIKMIDI 0

MIDI Library for Objective-C and Swift OS X and iOS apps.

push event Learnfield-GmbH/CodingChallenge

Martin 'Arkardiusz' Mlostek

commit sha eb90eaebf938a6465eda4631c7d906a4d458c699

fixed randomizing function

view details

push time in 10 days

issue opened Flinesoft/BartyCrouch

Possible to use default translation for all languages

I am using the "transform" task with this setup:

[update]
tasks = ["transform", "normalize"]

[update.interfaces]
path = "Wynton/"
defaultToBase = true
ignoreEmptyStrings = true
unstripped = false

[update.code]
codePath = "Wynton/"
localizablePath = "."
defaultToKeys = true
additive = false
unstripped = false

[update.transform]
codePath = "Wynton/"
localizablePath = "Wynton/Localization"
transformer = "swiftgenStructured"
supportedLanguageEnumPath = "Wynton/Localization"
typeName = "BartyCrouch"
translateMethodName = "translate"

[update.normalize]
path = "Wynton/Localization"
sourceLocale = "en"
harmonizeWithSource = true
sortByKeys = false

[lint]
path = "Wynton/Localization"
duplicateKeys = true
emptyValues = true

Now when I create new localized text like this:

        textView.text = BartyCrouch.translate(key: "something went wront",
                                              translations: [.english: "Something went wrong"],
                                              comment: "Something went wrong message bla bla")

Then this value only shows up in the English translation. But is it possible to apply a default text to all other languages, or to use the base language (en here) for all languages?

For QA it's a bit strange if the text is empty rather than something obviously wrong. If testers run the app and a label is simply missing, they think the screen is fine. But if there were a default string like <untranslated>, or just the English translation (e.g. in a Chinese app), they would instantly see that something is buggy/not yet translated.

Is there a way to set a default string when generating a new entry in the strings files?
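Purely to illustrate the desired behavior (a hypothetical sketch of output, not something BartyCrouch is documented to generate), a newly created entry in e.g. zh.lproj/Localizable.strings could fall back to a marker or to the base-language text instead of an empty value:

```
/* Something went wrong message bla bla */
"something went wront" = "<untranslated>";

/* or, falling back to the base (en) translation: */
"something went wront" = "Something went wrong";
```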

created time in 3 months

issue comment superpoweredSDK/Low-Latency-Android-iOS-Linux-Windows-tvOS-macOS-Interactive-Audio-Platform

Is it possible to use superpowered with the iOS13 style MPMediaPickerController?

Hm, when I try it (it's a slightly older version of Superpowered), I get a

SuperpoweredAdvancedAudioPlayerEvent_LoadError

The asset URL looks like this:

ipod-library://item/item.m4a?id=8956556064155754225, which should be fine. I have tested the example above, and the same asset URL seems to work fine.

Was there a required update to your library? (I can't just update; the app is still in an evaluation phase and is pretty old.)

mlostekk

comment created time in 4 months

issue comment superpoweredSDK/Low-Latency-Android-iOS-Linux-Windows-tvOS-macOS-Interactive-Audio-Platform

Is it possible to use superpowered with the iOS13 style MPMediaPickerController?

Okay, thanks for the example. I assume this does not work for objects streamed from the cloud?

mlostekk

comment created time in 4 months

issue opened superpoweredSDK/Low-Latency-Android-iOS-Linux-Windows-tvOS-macOS-Interactive-Audio-Platform

Is it possible to use superpowered with the iOS13 style MPMediaPickerController?

Apple made some changes to the way the "local" files in the Apple Music library are retrieved. I am only referring to the ones that are available on the local device, not streaming.

Is it possible to access those files' buffers and let superpowered run with them?

created time in 4 months

issue comment mixpanel/mixpanel-iphone

UndefinedBehaviorSanitizer: undefined-behavior

@zihejia did you have a chance to look into this?

mlostekk

comment created time in 5 months

delete branch mlostekk/AFNetworking

delete branch : patch-1

delete time in 5 months


issue comment mixpanel/mixpanel-iphone

UndefinedBehaviorSanitizer: undefined-behavior

Oh, well, I forgot to recompile. This is still happening; see the attached screenshot.

Steps to reproduce

  1. Enable the UndefinedBehaviorSanitizer
  2. Have an app with a basic usage of MPTweakValue(@"subscriptionOffer", @"default")
  3. Let it run on an iOS device (iOS 12 or 13)
  4. See the sanitizer breakpoint inside MPTweakValue.m

It's because the pointer *entry->encoding is a null pointer, which leads to the undefined behavior.

Our usage of the tweak values looks like this:

#define EXPERIMENT_IDENTIFIER @"identifier"
#define EXPERIMENT_VALUE @"value"
#define EXPERIMENTS_LOADED @"Experiments Loaded"
#define EXPERIMENTS_FAILED_LOADING @"Experiments Failed Loading"

RCT_EXPORT_METHOD(getExperiments:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
{
    // Start Timing the loading of the experiments
    [[Mixpanel sharedInstance] timeEvent: EXPERIMENTS_LOADED];
    [[Mixpanel sharedInstance] joinExperimentsWithCallback:^
    {
        //Build an Array with the Fetched or Default experiment values and their identifiers.
        NSArray *experiments = @[@{EXPERIMENT_IDENTIFIER   : @"stuffA",
                                   EXPERIMENT_VALUE: @(MPTweakValue(@"stuffA", NO)),
                                 },
                                 ...
                                 @{EXPERIMENT_IDENTIFIER   : @"stuffB",
                                   EXPERIMENT_VALUE: @(MPTweakValue(@"stuffB", NO)),
                                 }];                                          
        if (experiments) {
            [[Mixpanel sharedInstance] track:EXPERIMENTS_LOADED];
            resolve(experiments);
        }else {
            NSError *error;
            [[Mixpanel sharedInstance] track:EXPERIMENTS_FAILED_LOADING];
            reject(@"no_experiments", @"There were no experiments", error);
        }
    }];
}

[Screenshot 2019-10-15 at 19:03:41]

mlostekk

comment created time in 6 months

issue comment mixpanel/mixpanel-iphone

UndefinedBehaviorSanitizer: undefined-behavior

I will bring it up if I find the issue again.

mlostekk

comment created time in 6 months

issue closed mixpanel/mixpanel-iphone

UndefinedBehaviorSanitizer: undefined-behavior

Integration Method: Manual
Xcode Version: 10.3 (10G8)
Library Version: v3.4.5
Platform: iOS
Language: Objective-C

Description: When enabling the UndefinedBehaviorSanitizer I can see the following issue popping up:

/Project/Mixpanel/MPTweakInline.m:21:60: runtime error: load of null pointer of type 'char *'
SUMMARY: UndefinedBehaviorSanitizer: undefined-behavior /Project/Mixpanel/MPTweakInline.m:21:60 in 
(lldb) 

We use the MPTweakValue like this

MPTweakValue(@"subscriptionOffer", @"default")

Expected Behavior: I would expect either the encoding to be set, or the null pointer to be handled properly. With this setup I cannot run the UndefinedBehaviorSanitizer at all, because Mixpanel is already blocking it at the very start.

closed time in 6 months

mlostekk

issue comment mixpanel/mixpanel-iphone

UndefinedBehaviorSanitizer: undefined-behavior

Hi @zihejia

We have upgraded mixpanel to the latest version and the issue seems to be gone.

But for your information: the issue was happening, with the UndefinedBehaviorSanitizer enabled, inside

static MPTweak *_MPTweakCreateWithEntry(NSString *name, mp_tweak_entry *entry)
{
    NSString *encoding = [NSString stringWithFormat:@"%s", *entry->encoding];
    MPTweak *tweak = [[MPTweak alloc] initWithName:name andEncoding:encoding];
    ...

As the error above describes, entry->encoding was set to a null pointer, which was leading to the undefined behavior.

mlostekk

comment created time in 6 months

issue comment AudioKit/AudioKit

Failed to fill complex buffer in float converter

@IlyaBlinov

Would you mind formatting it properly? You can format code by using

` ` `  <- without the whitespaces, around your code
ashishkeshan

comment created time in 6 months

issue comment AudioKit/AudioKit

iOS13 Crash - Error: Failed to fill complex buffer in float converter ('insz')

Oh, I had some formatting issues; fixed now.

Well yes, on iOS 13 it leads to the same thing, but only on the iPhone 7 (not the iPad Air 2). This indicates that the default sample rate is wrongly set (the iPhone 7 uses 48 kHz by default).

The iOS 12 logs are maybe worth mentioning.

mlostekk

comment created time in 6 months

issue opened AudioKit/AudioKit

iOS13 - AUBuffer.h:61:GetBufferList: EXCEPTION (-1) [mPtrState == kPtrsInvalid is false]: ""

I am using AudioKit 4.8 like this:

class AudioKitWrapper {

    /// The main output mixer (after the amplitude tracker)
    private let masterMixer:  AKMixer

    /// The audio input
    private var microphone:   EZMicrophone?

    /// Construction
    init() {
        // global settings
        AKAudioFile.cleanTempDirectory()
        AKSettings.defaultToSpeaker = true
        AKSettings.enableRouteChangeHandling = true
        AKSettings.enableCategoryChangeHandling = true
        AKSettings.disableAVAudioSessionCategoryManagement = true
        AKSettings.audioInputEnabled = true
        AKSettings.playbackWhileMuted = false
        #if DEBUG
            AKSettings.enableLogging = AppConfig.AudioConfig.debugLogEnabled
        #endif
        // main mixer
        masterMixer = AKMixer()
    }

    /// Start up audiokit
    func startEngine(with audioInput: Bool) throws {
        // connect main nodes
        AudioKit.output = masterMixer
        // input
        microphone?.delegate = nil
        microphone = nil
        if audioInput {
            AKSettings.enableEchoCancellation = true
            let sizeOfFloat = UInt32(MemoryLayout<Float>.stride)
            microphone = EZMicrophone(microphoneDelegate: self,
                                      with: AudioStreamBasicDescription(mSampleRate: Float64(mirgaFactory.getSampleRate()),
                                                                        mFormatID: kAudioFormatLinearPCM,
                                                                        mFormatFlags: kLinearPCMFormatFlagIsFloat |
                                                                                      kAudioFormatFlagsNativeEndian |
                                                                                      kAudioFormatFlagIsPacked |
                                                                                      kAudioFormatFlagIsNonInterleaved,
                                                                        mBytesPerPacket: sizeOfFloat,
                                                                        mFramesPerPacket: 1,
                                                                        mBytesPerFrame: sizeOfFloat,
                                                                        mChannelsPerFrame: 1,
                                                                        mBitsPerChannel: sizeOfFloat * 8,
                                                                        mReserved: 0),
                                      startsImmediately: false)
        }
        // start
        try AKSettings.session.setCategory(.playAndRecord, mode: .measurement, options: .defaultToSpeaker)
        try AudioKit.start()
        microphone?.startFetchingAudio()
        Log.d("~~~ AudioKit: started")
    }

    /// Stop engine
    func stopEngine() throws {
        try AudioKit.stop()
        microphone?.stopFetchingAudio()
        Log.d("~~~ AudioKit: stopped")
    }

    /// Attach output
    func attach(audioPlayer: AKAudioPlayer) {
        audioPlayer >>> masterMixer
    }

    /// Reset
    func reset() {
        masterMixer.detach()
        Log.d("~~~ AudioKit: reset")
    }

}

/// Handle EZMicrophone input
extension AudioKitWrapper: EZMicrophoneDelegate {

    /// Playing state changed
    func microphone(_ microphone: EZMicrophone!, changedPlayingState isPlaying: Bool) {
        Log.i("playing state - \(isPlaying)")
    }

    /// Device changed
    func microphone(_ microphone: EZMicrophone!, changedDevice device: EZAudioDevice!) {
        Log.i("device - \(String(describing: device))")
    }

    /// Audio stream description
    func microphone(_ microphone: EZMicrophone!, hasAudioStreamBasicDescription audioStreamBasicDescription: AudioStreamBasicDescription) {
        Log.i("stream - \(audioStreamBasicDescription)")
    }

    /// Input buffer handling
    func microphone(_ microphone: EZMicrophone!,
                    hasAudioReceived buffer: UnsafeMutablePointer<UnsafeMutablePointer<Float>?>!,
                    withBufferSize bufferSize: UInt32,
                    withNumberOfChannels numberOfChannels: UInt32,
                    atTime timestamp: UnsafePointer<AudioTimeStamp>!) {
        writeSamplesIntoCircularBuffer(buffer[0]!, bufferSize, timestamp)
    }
}

And I am getting this error message in the log (I assume once per audio frame):

AUBuffer.h:61:GetBufferList: EXCEPTION (-1) [mPtrState == kPtrsInvalid is false]: ""

created time in 6 months

issue comment AudioKit/AudioKit

Volume Control Issue on iPhone7

I assume AudioKit just amplifies an issue that is already underlying. The volume control is pretty weird when AK is used on top of the setup described in the branch/link above.

mlostekk

comment created time in 6 months

issue comment AudioKit/AudioKit

Volume Control Issue on iPhone7

To be honest, meanwhile I think it's not AudioKit related. I have created this SO question. The volume control issues seem to happen with the voiceProcessingIO.

https://github.com/mlostekk/AudioKitSampleRate/tree/issue/outputVolumeNoAudiokit (branch outputVolumeNoAudiokit)

mlostekk

comment created time in 6 months

push event mlostekk/AudioKitSampleRate

Martin 'Arkardiusz' Mlostek

commit sha de11c20ab471097348d8490ad4da1f7d4ff21be6

changed mode to default

view details

Martin 'Arkardiusz' Mlostek

commit sha 589f7b861b5e5df62e5b174ae16e7d358d372de5

intellij update

view details

push time in 6 months

create branch mlostekk/AudioKitSampleRate

branch : issue/outputVolumeNoAudiokit

created branch time in 6 months

issue opened AudioKit/AudioKit

Volume Control Issue

Hey there.

We are using the microphone input to do some DSP and AudioKit for output. With the current setup we are having some volume issues: pulling down the volume does not always mute it, and on some devices (mostly iPhone 7 and above) we are experiencing extremely loud playback from AudioKit.

Any idea what the reasons could be? Here is the demo project. It works fine on older devices (up to the iPhone 6s Plus), but starting with the iPhone 7 the volume is highly unpredictable.

https://github.com/mlostekk/AudioKitSampleRate/tree/issue/volumeControl

The branch is issue/volumeControl

created time in 6 months

push event mlostekk/AudioKitSampleRate

Martin 'Arkardiusz' Mlostek

commit sha 59f1cb4fc99c7ed0ef3645d6c1e3c3ae611fe119

volume control issue finalized

view details

push time in 6 months

issue comment ReactiveX/RxSwift

ITMS-90809: Deprecated API Usage - UIWebView

Apps are approved just fine! Apple will send out an email to developers shortly!


voyage11

comment created time in 6 months

delete branch mlostekk/AudioKit

delete branch : patch-1

delete time in 6 months

create branch mlostekk/AudioKitSampleRate

branch : issue/volumeControl

created branch time in 6 months

pull request comment tensorflow/tensorflow

fixed __thread define usage for i386 build

@rthadur okay, was this issue addressed in 2.0?

mlostekk

comment created time in 6 months

PR opened AudioKit/AudioKit

added documentation

I had to debug the framework just to find out that, for proper scheduling, the player may not be in the paused state. Just added some comments.

+2 -0

0 comment

1 changed file

pr created time in 6 months

push event mlostekk/AudioKit

Martin Mlostek

commit sha d15d8513b8e51ae21b915f2699775b9fc43f1d1a

added documentation I had to debug the framework just to find out that for proper scheduling the player may not be in the paused state. Just added some comments

view details

push time in 6 months

issue comment TheAmazingAudioEngine/RealtimeWatchdog

No effect. Library still working?

Oh, okay. But it should detect regular C mallocs, right?

mlostekk

comment created time in 7 months

issue opened tensorflow/tensorflow

Is tensorflow lite malloc free?

I read that TFL Micro is malloc-free, but is TensorFlow Lite also malloc-free? I am wondering if it's safe to use inside an audio thread.

created time in 7 months
